I recently and unintentionally sparked a large debate about whether it is legal in C/C++ to use the expression &P->m_foo when P is a null pointer. The programmers' community…
The Role of Technology in Retail: Is it enough?
My friend and colleague Jon Stine (@joncstine1) recently penned a blog about technology in retail. Jon has extensive retail industry and technology expertise and offers a great perspective on how technology can address the challenges retailers face. From my perspective, the challenge for retailers with storefronts is best summed up in one question: can you deliver a killer shopping experience, from sofa to store aisle? Sadly, many retailers cannot answer yes to this question. The vast majority don't invest in innovative shopping solutions. Many are content to follow the same old formula of price discounting, coupons, and Sunday circulars, a well-proven formula that is just too hard to break from. Yet we know that formula no longer fits the new connected consumer.
The evolution of the connected consumer has been covered at length in the popular press for at least the last three years. Yet many retailers have missed it. You know, the Millennial generation: by 2020 it will comprise 75% of the workforce and command over $2.5 trillion in purchasing power. This segment is always connected and values experiences as much as price. And because they are always connected, they never shop without their device. Yes, the evolution has been under way for some time, and yes, Millennials are reshaping the shopping experience. What are you going to do about it?
Retailers are facing a strategic inflection point, which could mean an opportunity to prosper or a slow ride toward demise. At least that is my point of view. Jon argues that a few factors are relevant to creating an innovative shopping experience.
- “Showrooming” is multidirectional (in-store and online) and it is here to stay.
- Leveraging big data can have a profound impact on your brand and the experience you deliver. Consider it the starting point as you create a new shopper's journey.
- Security must become a strategic imperative for the way you conduct business – trust is won in drips and lost in buckets. Cybercriminals are well funded and profitable and hacking will continue as the new normal.
As mentioned in Jon's blog, retailers have long chosen to focus on maintaining ongoing operations rather than investing in growth and innovation. Growth and innovation don't come cheap. In fact, they are more than a technology roadmap; they are a business strategy. Why do consumers flock to Amazon or any of the "new concept" stores? I argue it boils down to the experience. Amazon provides the ultimate in clienteling and sales assistance. The new concept stores I had the privilege to tour in NYC during NRF 2015 offer innovative shopping experiences.
The collection of stores we visited all offered a unique and engaging shopping experience.
Rebecca Minkoff: a connected fashion store with technology envisioned and planned by eBay, bringing the online experience into the physical store through in-store interactive shopping displays. Once shoppers select the clothes they want to try on, they tap a button to have a sales associate bring the items to a dressing room.
Under Armour Brand House: a physical space designed to become a destination for shoppers. The strategy for the stores is to engage the shopper through storytelling. UA founder Kevin Plank is more interested in aligning product communication and retail presentation than anything else. He claims UA focuses 80% on storytelling and 20% on product, just the opposite of so many other product retailers.
Converse: yup, that old classic, the Chuck Taylor canvas shoe. Converse has offered online customization for some time. But what if you want an immediate, unique shoe to wear to an event? Now you can visit a Converse store, select your favorite Chucks, and set about creating your own personalized style.
Much as Amazon offers a unique shopping experience, these stores invested in delivering innovation. It wasn't a technology solution alone; it was a desire, from top to bottom, to give the shopper something unique and innovative.
Do you want help delivering growth and innovation in your retail environment? Intel isn't going to solve all of this on its own. We work with very talented fellow travelers who offer solutions to achieve growth and innovation.
FLOPS means total floating-point operations per second, a measure used in high-performance computing. In general, Intel® VTune™ Amplifier XE only provides a metric named Cycles Per Instruction…
Part 1: 8 Ways to Secure Your Cloud Infrastructure
Cloud security remains a top concern for businesses. Fortunately, today’s data center managers have an arsenal of weapons at their disposal to secure their private cloud infrastructure.
Here are eight things you can use to secure your private cloud.
1. AES-NI Data Encryption
End-to-end encryption can be transformational for the private cloud, securing data at all levels through enterprise-class encryption. The latest Intel processors feature Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), a set of new instructions that enhance performance by speeding up the execution of encryption algorithms.
The instructions are built into Intel® Xeon server processors as well as client platforms including mobile devices.
When encryption software utilises them, the AES-NI instructions dramatically accelerate encryption and decryption – by up to 10 times compared with software-only AES.
This speedy encryption means that it is possible to incorporate encryption across the data centre without significantly impacting infrastructure performance.
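As a concrete sketch of what application-level AES encryption looks like: the third-party Python `cryptography` package delegates AES to OpenSSL, which transparently uses AES-NI on capable processors. The key, nonce, and plaintext below are purely illustrative.

```python
import os

# Assumes the third-party 'cryptography' package is installed
# (pip install cryptography); it routes AES through OpenSSL,
# which uses AES-NI automatically when the CPU supports it.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # AES-256 key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # standard 96-bit GCM nonce

# Authenticated encryption: confidentiality plus integrity.
ciphertext = aesgcm.encrypt(nonce, b"customer record", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"customer record"
```

The same code runs unmodified on hardware with or without AES-NI; only the throughput differs, which is why the acceleration is transparent to the data centre.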
2. Security Protocols
By incorporating a range of security protocols and secure connections, you will build a more secure private cloud.
As well as encrypting data, clouds can also use cryptographic protocols to secure browser access to the customer portal, and to transfer encrypted data.
For example, the Transport Layer Security (TLS) and Secure Sockets Layer (SSL) protocols are used to secure communications over networks, including the Internet. Both are widely used for applications such as secure web browsing (through HTTPS), as well as email, IM, and VoIP.
They are also critical for cloud computing, enabling applications to communicate over the network and throughout the cloud while preventing undetected tampering that modifies content, or eavesdropping on content as it’s transferred.
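To make the protocol discussion concrete, here is a minimal sketch using Python's standard `ssl` module to build a client-side TLS context with certificate verification enabled. The TLS 1.2 floor is a choice made here for illustration, not something the module forces.

```python
import ssl

# Client-side context with modern, secure defaults:
# server certificates are verified and hostnames are checked.
ctx = ssl.create_default_context()

# Refuse legacy SSL and early TLS versions explicitly.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname
```

A socket wrapped with `ctx.wrap_socket(sock, server_hostname=...)` would then negotiate an encrypted channel that detects tampering in transit, which is exactly the property the cloud portal and data-transfer scenarios above rely on.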
3. OpenSSL, RSAX and Function Stitching
Intel works closely with OpenSSL, a popular open source, multiplatform security library. OpenSSL offers a FIPS 140-2 validated module; FIPS 140-2 is a computer security standard administered by the National Institute of Standards and Technology's Cryptographic Module Validation Program.
It can be used to secure web transactions through services such as Gmail, e-commerce platforms and Facebook, to safeguard connections on Intel architecture.
Two OpenSSL functions that Intel has contributed to are RSAX and function stitching.
The first is a unique implementation of the popular RSA 1024-bit algorithm that produces significantly better performance than previous OpenSSL implementations. RSAX can speed up SSL session initiation by up to 1.5 times. This provides a better user experience and increases the number of simultaneous sessions your server can handle.
As for function stitching: bulk data buffers use two algorithms for encryption and authentication, but rather than encrypting and authenticating data serially, function stitching interleaves instructions from these two algorithms. By executing them simultaneously, it improves the utilisation of execution resources and boosts performance.
Function stitching can result in up to 4.8 times performance improvement for secure web servers when combined with RSAX and Intel AES-NI.
4. Data Loss Prevention (DLP)
Data protection is rooted in the encryption and secure transfer of data. Data loss prevention (DLP) is a complementary approach focused on detecting and preventing the leakage of sensitive information, either by malicious intent or inadvertent mistake.
DLP solutions can profile content against rules and capture violations or index and analyse data to develop new rules. IT can establish policies that govern how data is used in the organisation and by whom. By doing this they can clarify security practices, identify potential fraud and avert accidental or unauthorised malicious transfer of information.
An example of this technology is McAfee Total Protection for Data Loss Prevention. This software can be used to support an organisation’s governance policies.
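The rule-based content profiling described above can be sketched in a few lines. This toy scanner is not how McAfee's product works internally; the rule names and patterns are invented for illustration, and real DLP engines add indexing, context, and policy workflow on top.

```python
import re

# Toy rule set: rule name -> pattern for sensitive content.
RULES = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan(text: str) -> list:
    """Profile text against the rules and return the names of any violations."""
    return [name for name, pattern in RULES.items() if pattern.search(text)]

assert scan("my SSN is 123-45-6789") == ["us_ssn"]
assert scan("nothing sensitive here") == []
```

A captured violation would then feed the policy side: logging the event, blocking the transfer, or flagging it for review, per the governance rules IT has established.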
5. Identity Management and Authentication

Protecting your platform begins with managing the users who access your cloud. This is a large undertaking because of the array of external and internal applications, and the continual churn of employees.
Ideally, authentication is strengthened by rooting it in hardware. With Intel® Identity Protection Technology (Intel® IPT), Intel has built tamper-resistant, two-factor authentication directly into PCs based on third-generation Intel® Core™ vPro™ processors, as well as Ultrabook™ devices.
Intel IPT offers token generation built into the hardware, eliminating the need for a separate physical token. Third-party software applications work in tandem with the hardware, strengthening the authentication process.
Through Intel IPT, businesses can secure their access points by using one-time passwords or public key infrastructure.
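As an illustration of the one-time-password mechanism (Intel IPT generates the token in hardware, whereas this sketch does it in software), here is a minimal RFC 4226/6238 implementation using only the Python standard library.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based variant: the counter is the current 30-second window."""
    return hotp(secret, int(time.time()) // period, digits)

# Known test vectors from RFC 4226, Appendix D.
secret = b"12345678901234567890"
assert hotp(secret, 0) == "755224"
assert hotp(secret, 1) == "287082"
```

The second authentication factor is then "something you have": only a device holding the shared secret (in Intel IPT's case, embedded in the platform) can produce the current code.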
6. API-level Controls
Another way to secure your cloud infrastructure is by enforcing API-level controls. The API gateway layer is where security policy enforcement and cloud service orchestration and integration take place. An increased need to expose application services to third parties and mobile applications is driving the need for controlled, compliant application service governance.
With API-level controls, you gain a measure of protection for your departmental and edge system infrastructure, and reduce the risk of content-borne attacks on applications.
Intel® Expressway Service Gateway is an example of a scalable software appliance that provides enforcement points and authenticates API requests against existing enterprise identity and access management systems.
7. Trusted Servers and Compute Pools
Because of cloud computing's reliance on virtualisation, it is essential to establish trust in the cloud. This can be achieved by creating trusted servers and compute pools. Intel® Trusted Execution Technology (Intel® TXT) builds trust into each server by establishing a hardware root of trust that helps assure system integrity within each system.
The technology checks hypervisor integrity at launch by measuring the code of the hypervisor and comparing it to a known good value. Launch can be blocked if the measurements do not match.
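The measure-and-compare step can be illustrated in software. Note that this is only an analogy: Intel TXT performs the measurement in hardware at launch, and the image bytes and known-good value below are invented.

```python
import hashlib

def measure(code: bytes) -> str:
    """Compute a cryptographic measurement (hash) of launch-time code."""
    return hashlib.sha256(code).hexdigest()

# Known-good value provisioned ahead of time for the trusted hypervisor build.
KNOWN_GOOD = measure(b"hypervisor-image-v1")

def allow_launch(image: bytes) -> bool:
    """Permit launch only when the measurement matches the known-good value."""
    return measure(image) == KNOWN_GOOD

assert allow_launch(b"hypervisor-image-v1")            # untampered image boots
assert not allow_launch(b"hypervisor-image-v1-evil")   # modified image is blocked
```

Any change to the code, even a single byte, produces a different hash, which is what lets the platform detect a tampered hypervisor before it runs.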
8. Secure Architecture Based on TXT
It’s possible to create a secure cloud architecture based on TXT technology, which is embedded in the hardware of Intel Xeon processor-based servers. Intel TXT works with the layers of the security stack to protect infrastructure, establish trust and verify adherence to security standards.
As mentioned, it works with the hypervisor layer, and also the cloud orchestration layer, the security policy management layer and the Security Information and Event Management (SIEM), and Governance, Risk Management and Compliance (GRC) layer.
Cloud security has come a long way. It's now possible, through the variety of tools and technologies outlined above, to adequately secure both your data and your users. In so doing, you will establish security and trust in the cloud and gain from the agility, efficiency and cost savings that cloud computing brings.
Ever wanted to use your mobile phone to remotely start your DeLorean? Well, now you can! Thanks to Jason Fox, Microsoft Evangelist, for sharing this project from this weekend's Hackster.io Hardware Weekend, Dallas edition. There you have it, folks…
April 19, 2015 marks the 50th anniversary of the publication of Moore's Law, generally stated as the doubling of the number of device components on a silicon chip every two years. Most conversations on the subject tend to center around…
Security is not relevant until it fails. This is the basis for many of the recurring cycles we see in cybersecurity. New technology rushed to market is easily compromised by attackers; the resulting impacts drive demand for security, and bolt-on solutions begin to emerge. It is becoming evident that this trend is not sustainable given the flood of new devices and the significant growth in attackers' capabilities. The bad guys have a growing advantage. It is time for the industry to change. Bruce Schneier reinforces the point in the CIO article "Schneier on 'really bad' IoT security: 'It's going to come crashing down.'"
Although the problem is not limited to the Internet of Things, the IoT revolution promises a plethora of devices that integrate into our lives, collecting data, providing recommendations, extending what we can control, and serving up meaningful information right when it is needed. But these devices, just like the familiar computers we use every day, are subject to vulnerabilities.
Hackers are responding faster at compromising new software, operating systems, and even hardware. The trend will become more prevalent, expanding beyond heavy compute platforms to the smaller IoT devices, wearables, home automation, industrial controls, and vehicle technology that will proliferate in the coming years. The development of tools, practices, and best-known methods for vulnerability inspection of these new use cases will accelerate, allowing attacks to occur faster and deeper in the stack.
We all play a role in how this cycle will unfold. Standards bodies can choose to institute strong security controls to establish a strong defensive baseline, or they can default to lower barriers of entry to encourage rapid adoption. Device manufacturers and solution providers can choose to implement robust quality and testing as part of a secure design life cycle, or cut corners in order to get to market faster. Security firms can be proactive in developing solutions that anticipate attackers' likely maneuvers, or play it safe and wait for impacts to drive customer demand. Consumers can vote with their purchases to require good security in products, or blindly buy without concern. We all have a role and can influence how the history of emerging technology will be written. Where do you stand?
For more of my rant, watch my Rethinking Cybersecurity Strategy video at the CTO Forum event where I challenge the top minds in technology to consider their responsibility and what is needed to change course to a more secure future.
IT Peer Network: My Previous Posts
Ready or Not, Cross-Channel Shopping Is Here to Stay
Of all the marketplace transitions that have swept through the developed world’s retail industry over the last five to seven years, the most important is the behavioral shift to cross-channel shopping.
The story is told in these three data points1:
- 60 plus percent of U.S. shoppers (and a higher number in the U.K.) regularly begin their shopping journey online.
- Online ratings and reviews have the greatest impact on shopper purchasing decisions, above friends and family, and have four to five times greater impact than store associates.
- Nearly 90 percent of all retail revenue is carried out in the store.
Retail today is face-to-face with a shopper who’s squarely at the intersection of e-commerce, an ever-present smartphone, and an always-on connection to the Internet.
Few retailers are blind to the big behavioral shift. Most brands are responding with strategic omni-channel investments that seek to erase legacy channel lines between customer databases, inventories, vendor lists, and promotions.
Channel-centric organizations are being trimmed, scrubbed, or reshaped. There’s even a willingness — at least among some far-sighted brands — to deal head-on with the thorny challenge of revenue recognition.
All good. All necessary.
Redefining the Retail Space
But, as far as I can tell, only a handful of leaders are asking the deeper question: what, exactly, is the new definition of the store?
What is the definition of the store when the front door to the brand is increasingly online?
What is the definition of the store when shoppers know more than the associates, and when the answer to the question of how and why becomes — at the point of purchase — more important than what and how much?
What is the definition of the store beyond digital? Or of a mash-up of the virtual and physical?
What is the definition — not of brick-and-mortar and shelves and aisles and four-ways and displays — but of differentiating value delivery?
This is a topic we’re now exploring through whiteboard sessions and analyst and advisor discussions. We’re hard at work reviewing the crucial capabilities that will drive the 2018 cross-brand architecture.
Stay tuned. I’ll be sharing my hypotheses (and findings) as I forge ahead.
Global Director, Retail Sales
This is the second installment of the Tech in Retail series.
1 National Retail Federation. “2015 National Retail Federation Data.” 06 January 2015.
How did your desk change as a result of developing Internet of Things (IoT) projects and prototypes? I'm sure we've all played the "spot the difference" game at some point. Sometimes there are just…
Over the last several years, Intel IT has been implementing the Information Technology Infrastructure Library (ITIL) framework to transform our service delivery and enable us to align more effectively with the strategies and priorities of each of Intel’s lines of business (LOBs). In doing so, we can focus on high-priority activities that may potentially transform Intel’s entire business and boost the relevancy of IT. As the Chief of Staff for Product Development IT and the Director of Business Solutions Integration for Intel IT, I’m looking forward to meeting with others who have found the same value in using this practice or are considering starting that journey.
Intel IT at the Forefront of Business Relationship Management
From the top down, Intel IT fully understands the importance of business relationship management. In the last 18 months, we have transitioned from an organization loosely coupled to the business to one directly aligned with the business, literally sitting at the table to help make key business decisions.
—Vaughan Merlyn, co-founder of the Business Relationship Management Institute
In 2013, Intel’s CIO, Kim Stevenson, personally asked each LOB to include an IT general manager (GM) on their staff. This suggestion was met favorably by the LOBs, who saw tremendous value in connecting more formally and more closely with IT.
Intel IT has adopted a user-centered approach to delivering IT services that enables us to optimize our IT solutions, improve employee productivity, and increase business velocity. This approach involves proactively engaging and partnering with Intel employees and business groups to learn about their needs for information, technology, and services, as well as the experience they desire. ITIL has been integral in placing the customer at the center, and our new Business Solutions Integration (BSI) service aligns with our user-centered IT strategy. It integrates business relationship management and business demand management, presenting the LOBs with a "One IT" view. Each LOB has a dedicated IT LOB GM, along with other dedicated IT staff who form that LOB's core IT team: a business relationship manager, a principal engineer, and a finance controller.
“The day I’m representing Intel’s LOB more than my day job, I’ve arrived.”
—Intel IT Staff Member
With a single point of contact for IT, the LOBs can more easily request services. But more important, IT is attuned to the LOB’s strategies, priorities, and pain points. We’ve slashed the time it takes us to say “yes” or “no” to a business request from an average of 36 hours to 8 hours, and our level of support has improved dramatically, according to annual Partnership Excellence surveys.
Run, Grow, Transform
IT used to be thought of as the organization that kept the lights on and the business running, building tools when necessary. Here at Intel, while Intel IT does indeed keep the business running, our best value lies in proactively collaborating with our customers. Therefore, instead of focusing exclusively on "Run" activities (such as providing network connectivity), we also actively pursue "Grow" and "Transform" activities.
In the “Grow” category, for example, we conduct proofs of concept (PoCs) and enterprise early adoption tests for emerging technologies. Even more valuable are our “Transform” activities, where we are directly involved in co-creating marketable products with our product groups and providing Intel with competitive advantage.
Our BSI service incorporates these higher-value activities through its integration with the IT2Intel program. I’ll explore each of these activities in more detail in future blogs. But briefly, our IT2Intel program enables us to accelerate Intel’s growth in enterprise markets by leveraging Intel IT’s expertise in partnership with Intel product groups.
Shifting with the Business
Our close alignment with Intel's lines of business (LOBs) helps us shift our priorities to meet the growing demand from the Internet of Things Group (IoTG).
As an example of how our direct involvement with Intel’s LOBs shapes our work, consider the following graphic that shows the distribution of business requests from the various LOBs. In 2013, Intel’s Internet of Things Group (IoTG), represented by the dark blue block at the top of the left-hand graph, had very few requests for IT. But in 2014, the number of IoTG business requests grew significantly. Because we have a seat at the table, we were able to evolve with the business and meet the demands of this burgeoning sector of Intel’s market.
Through our close communication with the IoTG and early PoCs, we’ve deployed infrastructure based on the Intel® IoT Platform. We are leveraging that experience to help the group deliver solutions to Intel customers. This is just one example of how, through our BSI service, IT stays relevant and valuable to the entire enterprise.
I encourage you to connect with me on the IT Peer Network and on Twitter @azmikephillips to share your thoughts and experiences on IT business relationship management and how it can transform the role of IT from transactional to transformational.
OEMs and other customers use Intel’s system-on-a-chip (SoC) products in their mobile devices. Intel makes a variety of SoCs, and any one SoC includes many components, with processor, memory controller, graphics, and sound integrated on a single chip. Each of these components comes with its own documentation, and there’s even more documentation that describes how to integrate these components with other custom components designed by the OEM. Pretty soon, you have tens of thousands of pages of documentation.
But each Intel customer needs only a fraction of the total available documentation — a piece here and a piece there. They don’t want to read a 20,000-page document to find the three paragraphs they need.
Intel IT recently partnered with the Intel product group that helps Intel customers with mobile device design to improve the delivery of content to customers.
Enter Stage Right: Topic-Based Content
Which would you rather use: a 500-page cookbook with general headings like "stove-top cooking" and "oven recipes," or one with tabs for breakfast, lunch, and dinner, and cross-references and indexes that help you find casseroles, breads, stir-fries, and crockpot recipes, as well as recipes that use a particular ingredient such as sour cream or eggs? Clearly, the latter would be easier to use because you can quickly find the recipes (topics) that interest you.
Darwin Information Typing Architecture, known as DITA (pronounced dit-uh), is an XML-based publishing standard defined and maintained by the OASIS DITA Technical Committee. DITA can help structure, develop, manage, and publish content, making it easier to find relevant information.
Four basic concepts underlie the DITA framework:
- Topics. A topic is the basic content unit of DITA, defined as a unit of information that can be understood in isolation and used in multiple contexts. Topics address a single subject and are short and standardized to include defined elements, such as name, title, information type, and expected results.
- DITA maps. DITA maps identify the products a topic is associated with and the target audience. All these things help determine which topics are included in search results. DITA maps also include navigational information, such as tables of contents.
- Output formats. DITA-based content can be delivered in various formats, such as web, email, mobile, or print. For ease of use, the content's final design and layout (its presentation) varies to accommodate the unique characteristics of each output format.
- Dynamic content. Customers can select and combine different topics to create their own custom documents, which is sort of like replacing one piece of a DNA map to create a brand-new animal.
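As a concrete example of topic-based content, a minimal DITA concept topic is plain XML and can be parsed and queried like any other XML document. The topic id, title, and body text below are made up for illustration; the element structure follows the OASIS DITA concept model.

```python
import xml.etree.ElementTree as ET

# A minimal DITA concept topic: one subject, standardized elements.
topic_xml = """
<concept id="aes_ni_overview">
  <title>Intel AES-NI</title>
  <conbody>
    <p>Instructions that accelerate AES encryption.</p>
  </conbody>
</concept>
"""

topic = ET.fromstring(topic_xml)
assert topic.get("id") == "aes_ni_overview"
assert topic.find("title").text == "Intel AES-NI"
```

Because each topic carries its own id and metadata, a publishing pipeline can select topics by a DITA map, assemble them into a custom document, and render the same source to web, print, or mobile output.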
(If DITA intrigues you, consider attending the 2015 Content Management Strategies/DITA North America conference in Chicago, April 20–22).
Intel’s Mobile Design Center Leverages DITA to Improve Our Customer’s User Experience
We designed a solution that eliminates the need for the previous long-form documentation. Instead, the solution enables SoC customers to assemble relevant content based on topics of interest. To achieve this, the Client Computing Group changed its documentation structure to topic-based content so that customers can quickly find highly specific information, enabling faster time to market for their mobile solutions and reducing the amount of time Intel engineers must spend helping customers find the information they need. The content is tagged with metadata so that customers can search on specific topics and bundle those topics into custom binders that they can reference or print as needed.
The Intel Mobile Design Center portal is described in detail in our paper, “Optimizing Mobile-Device Design with Targeted Content.” The portal’s ease of use contributed significantly to overall customer satisfaction with the solution. According to a survey we conducted, customer satisfaction scores have increased from 69 percent before implementation to 80 percent after.
Based on what the mobile communications group created in the Mobile Design Center, other groups are taking notice and creating their own design centers. For example, the Service Provider Division has committed to creating its own design center and is delivering all of its content in DITA to provide an even more interactive design for its customers.
Getting from Here to There
Converting existing FrameMaker and Word documents to DITA was not an easy undertaking. For the mobile communications group, some content wasn't converted due to lack of time, although the group has committed to using DITA for all new content. This group performed the conversion manually, at about 5 to 10 pages per hour; the entire conversion project took months.
For the second group we worked with, who converted their entire documentation set, the conversion was accomplished using several methods. For large FrameMaker docs, they used a third-party product to partially automate the conversion process. While the resulting DITA docs still needed manual touch-up, the automated conversion was a time-saver. For smaller FrameMaker documents, topics were created manually. For Word docs, topics were manually cut and pasted.
So, was the effort worth it? Both groups agree that indeed it was. First, conversion to DITA revealed that there was a lot of duplication between documents. When in the DITA format, revisions to a topic only take place in that topic — there is no need to search for every document that contains that topic. Not only does this reduce the time it takes to make revisions, but it also improves the quality of our documentation. In the past, without DITA, some documentation might be out-of-date because a topic was revised in one place but not in another.
“By converting to DITA we reduced the amount of content, allowing for reuse. This also reduced the amount of work for the authors,” said one team member. “DITA gives you a better feel of the makeup of your content,” said another.
Other team members touted improved revisions and version control and the ability to tag content by more than just document name.
What’s Next for DITA at Intel?
Because the solution we created is scalable, we anticipate that additional product and business groups across Intel will begin to take advantage of topic-based content to improve customer experience and Intel’s efficiency.
I’d love to hear how other enterprises are putting DITA to work for their customers, increasing customer satisfaction, encouraging dynamic content creation, and accelerating the pace of business. Feel free to share your comments and join the conversation at the IT Peer Network.
Business analytics and data insights empower today's business leaders to make faster decisions. A recent data consolidation and analytics project increased Intel's revenue by $264 million in 2014, as highlighted in our recently published Annual Business Review. This $264 million represents only a portion of the $351 million in value generated by Intel IT through the use of big data, business intelligence, and analytics tools. Access to connected data in an efficient and timely manner has enabled stakeholders to analyze market trends and make faster, better business decisions.
The Right Data at the Right Time
Intel’s business processes use a significant amount of historical data to reach decisions. But isolated datasets are not very useful because they provide only a glimpse of a much larger picture. Recognizing the power of connected data, Intel IT engaged in an 18-month data cleansing and consolidation effort, connecting more than 200 GB of historical data from various disparate and vertical systems using common measures and dimensions.
The complexity of this project was daunting. There were many spreadsheets and applications, and even the same data had inconsistent identifiers in different datasets. Our efforts resulted in replacing more than 4,000 spreadsheets with a single database solution that included over 1,000 data measures and 12 dimensions, as well as tracking information for about 4 million production and engineering samples provided to customers.
Even connected data, however, is not inherently valuable unless it is conveyed in terms of trends and patterns that guide effective decision making. On top of our now-connected data, we added advanced analytics and data visualization capabilities that enable Intel’s decision makers to convert data into meaningful insights. About 9,000 application users who serve Intel and external customers have access to this data, along with 15,000 reporting users.
As part of the project, we automated our data management processes, so that we can now integrate new datasets in just a few hours, instead of in several months.
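The heart of a consolidation effort like the one described above is mapping each source system’s inconsistent identifiers onto a common dimension key before records are merged. The sketch below illustrates that idea in miniature; all identifiers, field names, and data are invented for illustration and do not reflect Intel’s actual pipeline.

```python
# Minimal sketch of identifier harmonization: rewrite each source's
# inconsistent product IDs onto one canonical dimension key so records
# from different systems can be merged into a single store.
# All names and values here are hypothetical.

CANONICAL_IDS = {
    # source identifier -> canonical dimension key
    "CPU-0042": "P0042",
    "cpu_42": "P0042",
    "MEM-0007": "P0007",
}

def harmonize(records):
    """Map each record's 'product' field to its canonical key.

    Records with unknown identifiers are set aside for manual review
    rather than silently dropped.
    """
    clean, unknown = [], []
    for rec in records:
        key = CANONICAL_IDS.get(rec["product"])
        if key is None:
            unknown.append(rec)
        else:
            clean.append({**rec, "product": key})
    return clean, unknown

source_a = [{"product": "CPU-0042", "units": 10}]
source_b = [{"product": "cpu_42", "units": 5}, {"product": "X9", "units": 1}]

clean, unknown = harmonize(source_a + source_b)
# Both spellings of the same product now share the key "P0042";
# the unrecognized "X9" record is flagged for review.
```

Once every source speaks the same dimension keys, automated integration of a new dataset reduces to building its identifier map, which is why onboarding can drop from months to hours.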
Boosting Sales with Reseller Market Insights
Another significant chunk of the previously mentioned $351 million — $76 million — was generated by a sales and marketing analytics engine that provides valuable information to Intel sales teams, helping them strategically focus their sales efforts to deliver greater revenue. The engine’s recommendations identify which customers sales reps should contact and what they should talk to them about. This data significantly shortened the sales cycle and enabled sales reps to reach customers who were previously off the radar. The fact that this recommendation engine garnered Intel a 2014 CIO 100 award illustrates how highly CIOs value technology’s role in today’s business environment.
What’s Next for Data Visualization at Intel
Going forward, we intend to bring collaborative analytics to Intel decision makers. For example, Intel IT has developed an Info Wall that harnesses the power of data visualization. This solution is built on Intel® architecture and is Intel’s first interactive video wall, with a viewing area measuring 5 feet high and 15 feet wide. While it’s too early to state any specific results, this unique implementation will enable new possibilities for business intelligence and data visualization. Currently, the Info Wall and its data focus on sales and marketing; we plan to soon expand its application to other areas of Intel business.
In an age when organizations such as Intel are rich in data, finding value in this data lies in the ability to analyze it and efficiently derive actionable business intelligence. Intel IT will continue to invest in tools that can transform data into insights to help solve high-value business problems.
Security was a major area of focus at HIMSS 2015 in Chicago. From my observations, here are a few of the key takeaways from the many meetings, sessions, exhibits, and discussions in which I participated:
Top-of-Mind: Breaches are top-of-mind, especially cybercrime breaches such as those recently reported by Anthem and Premera. No healthcare organization wants to be the next headline or incur the staggering business impact that follows. Regulatory compliance is still important, but in most cases it is not currently the top concern.
Go Beyond: Regulatory compliance is necessary but not enough to sufficiently mitigate the risk of breaches. To have a fighting chance at avoiding most breaches, and at minimizing the impact of those that do occur, healthcare organizations must go well beyond the minimum required for regulatory compliance.
Multiple Breaches: Cybercrime breaches are just one kind of breach. There are several others, for example:
- There are also breaches from loss or theft of mobile devices which, although often less impactful (because they typically involve a subset rather than all patient records), occur far more frequently than the cybercrime breaches that have recently hit the news headlines.
- Insider breach risks are severely underappreciated and, in most organizations, insufficiently mitigated. This kind of breach involves a healthcare worker accidentally exposing sensitive patient information to unauthorized access. In practice, this occurs when patient data is emailed in the clear, put unencrypted on a USB stick, posted to an insecure cloud, or sent via an unsecured file transfer app.
- Healthcare workers are increasingly empowered with mobile devices (personal, BYOD, and corporate), apps, social media, wearables, the Internet of Things, and more. These enable amazing new benefits in improving patient care, but they also bring major new risks. Well-intentioned healthcare workers, under time and cost pressure, have more and more rope to do wonderful things for improving care, but also to trip up inadvertently, with accidents that can lead to breaches. Annual “scroll to the bottom and click accept” security awareness training is often ineffective, and certainly insufficient.
- To improve the effectiveness of security awareness training, healthcare organizations need to engage healthcare workers on an ongoing basis. Practical strategies I heard discussed at this year’s HIMSS include gamified solutions that simulate spear-phishing emails and help healthcare workers learn to recognize and avoid them. Weekly or biweekly emails can help workers understand recent healthcare security events such as breaches at peer organizations (a “keeping it real” strategy): how they occurred, why they matter to healthcare workers, patients, and the organization, and how everyone can help.
- Ultimately, any organization seeking to achieve a reasonable security posture and sufficient breach risk mitigation must first instill a culture of “security is everyone’s job.”
What questions do you have? What other security takeaways did you get from HIMSS?
This article is meant for those programmers who are only getting started with the Visual Studio environment and trying to compile their C++ projects under it. Everything looks strange and complicated… Read more
When I was assigned to give a lecture on “Pengembangan Software Pendidikan” (Edu-Software Development) this semester for Chemistry Education Students, I was challenged. I had no idea how to teach… Read more
On April 19th, 1965, Gordon Moore introduced a fundamental way to view growth in technology, later labeled “Moore’s Law”: approximately every two years, the number of transistors in a chip would double. This culminated in a layman’s explanation of … Read more >
Hi, I am trying to set up an Intel SCS server to deploy AMT profiles to HP Intel vPro PCs. In order to do this, I need to provision a certificate. I got one from Comodo, but Intel SCS is asking for a CA Plugin. I couldn’t find this anywhere. Is onl… Read more
The idea of precision medicine is simple: When it comes to medical treatment, one size does not necessarily fit all, so it’s important to consider each individual’s inherent variability when determining the most appropriate treatment. This approach makes sense, but until recently it has been very difficult to achieve in practice, primarily due to a lack of data and insufficient technology. However, in a recent article in the New England Journal of Medicine, Dr. Francis Collins and Dr. Harold Varmus describe President Obama’s new Precision Medicine Initiative, saying they believe the time is right for precision medicine. The way has been paved, the authors say, by several factors:
- The advent of important (and large) biological databases;
- The rise of powerful methods of generating high-resolution molecular and clinical data from each patient; and
- The availability of information technology adequate to the task of collecting and analyzing huge amounts of data to gain the insight necessary to formulate effective treatments for each individual’s illness.
The near-term focus of the Precision Medicine Initiative is cancer, for a variety of good reasons. Cancer is a disease of the genome, and so genomics must play a large role in precision medicine. Cancer genomics will drive precision medicine by characterizing the genetic alterations present in patients’ tumor DNA, and researchers have already seen significant success in associating these genomic variations with specific cancers and their treatments. The key to taking full advantage of genomics in precision medicine will be using state-of-the-art computing technology and software tools to synthesize each patient’s genomic sequence data with the huge amount of available contextual data (annotation) about genes, diseases, and therapies, deriving real meaning from the data and producing the best possible outcomes for patients.
Big data and its associated techniques and technologies will continue to play an important role in the genomics of cancer and other diseases, as the volume of sequence data continues to rise exponentially along with the relevant annotation. As researchers at pharmaceutical companies, hospitals and contract research organizations make the high information processing demands of precision medicine more and more a part of their workflows, including next generation sequencing workflows, the need for high performance computing scalability will continue to grow. The ubiquity of genomics big data will also mean that very powerful computing technology will have to be made usable by life sciences researchers, who traditionally haven’t been responsible for directly using it.
Fortunately, researchers requiring fast analytics will benefit from a number of advances in information technology happening at just the right time. The open-source Apache Spark™ project gives researchers an extremely powerful analytics framework right out of the box. Spark builds on Hadoop® to deliver faster time to value to virtually anyone with some basic knowledge of databases and some scripting skills. ADAM, another open-source project from UC Berkeley’s AMPLab, provides a set of data formats, APIs, and a genomics processing engine that help researchers take special advantage of Spark for increased throughput. For researchers wanting to exploit the representational and analytical power of graphs in a scalable environment, one of Spark’s key libraries is GraphX. Graphs make it easy to associate individual gene variants with gene annotation, pathways, diseases, drugs, and almost any other information imaginable.
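To make the graph idea concrete, here is a toy property graph linking a gene variant to its gene, pathway, and a targeted therapy. This is a plain-Python sketch of the data model only; at scale this kind of traversal is what GraphX handles across a cluster, and the edge set here is a deliberately tiny illustration.

```python
# A property graph as (source, relation, destination) triples.
# BRAF V600E and vemurafenib are a well-known variant/drug pair;
# the relation names are chosen for this example.

edges = [
    ("variant:BRAF_V600E", "in_gene", "gene:BRAF"),
    ("gene:BRAF", "in_pathway", "pathway:MAPK_signaling"),
    ("variant:BRAF_V600E", "targeted_by", "drug:vemurafenib"),
]

def neighbors(node, relation=None):
    """Return nodes reachable from `node`, optionally filtered by relation."""
    return [dst for src, rel, dst in edges
            if src == node and (relation is None or rel == relation)]

# One-hop query: which drug targets this variant?
print(neighbors("variant:BRAF_V600E", "targeted_by"))  # prints ['drug:vemurafenib']
```

Because variants, genes, pathways, and drugs are all just vertices, adding a new kind of annotation means adding edges, not redesigning a schema; that flexibility is a large part of why graphs suit this domain.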
At the same time, Cray has combined high performance analytics and supercomputing technologies into the Intel-based Cray® Urika-XA™ extreme analytics platform, an open, flexible, and cost-effective platform for running Spark. The Urika-XA system comes preintegrated with Cloudera Hadoop and Apache Spark, optimized for the architecture to save time and management burden. The platform uses fast interconnects and an innovative memory-storage hierarchy to provide a compact and powerful solution for the compute-heavy, memory-centric analytics perfect for Hadoop and Spark.
Collins and Varmus envision more than 1 million Americans volunteering to participate in the Precision Medicine Initiative. That’s an enormous amount of data to be collected, synthesized and analyzed into the deep insights and knowledge required to dramatically improve patient outcomes. But the clock is ticking, and it’s good to know that technologies like Apache Spark and Cray’s Urika-XA system are there to help.
What questions do you have?
Ted Slater is a life sciences solutions architect at Cray Inc.
By Lisa Malloy, director of Policy Communications and Government Relations Today, Congress took a critical step in advancing U.S. global competitiveness with the introduction of Trade Promotion Authority (TPA) legislation. This is legislation our country sorely needs to maintain and accelerate … Read more >
I’m excited to announce that Intel has successfully closed the acquisition of Lantiq. This acquisition enables us to extend our success in cable home gateways into DSL and fiber markets giving us full coverage of broadband access methods around the world. … Read more >