April 19, 2015 marks the 50th anniversary of the publication of Moore’s Law — generally stated as the doubling of the number of device components on a silicon chip every two years. Most conversations on the subject tend to center around … Read more >
Ready or Not, Cross-Channel Shopping Is Here to Stay
Of all the marketplace transitions that have swept through the developed world’s retail industry over the last five to seven years, the most important is the behavioral shift to cross-channel shopping.
The story is told in these three data points1:
- More than 60 percent of U.S. shoppers (and an even higher share in the U.K.) regularly begin their shopping journey online.
- Online ratings and reviews have the greatest impact on shopper purchasing decisions, outranking friends and family and carrying four to five times the influence of store associates.
- Nearly 90 percent of all retail revenue is still transacted in the store.
Retail today is face-to-face with a shopper who’s squarely at the intersection of e-commerce, an ever-present smartphone, and an always-on connection to the Internet.
Few retailers are blind to the big behavioral shift. Most brands are responding with strategic omni-channel investments that seek to erase legacy channel lines between customer databases, inventories, vendor lists, and promotions.
Channel-centric organizations are being trimmed, scrubbed, or reshaped. There’s even a willingness — at least among some far-sighted brands — to deal head-on with the thorny challenge of revenue recognition.
All good. All necessary.
Redefining the Retail Space
But, as far as I can tell, only a handful of leaders are asking the deeper question: what, exactly, is the new definition of the store?
What is the definition of the store when the front door to the brand is increasingly online?
What is the definition of the store when shoppers know more than the associates, and when the answer to the question of how and why becomes — at the point of purchase — more important than what and how much?
What is the definition of the store beyond digital? Or of a mash-up of the virtual and physical?
What is the definition — not of brick-and-mortar and shelves and aisles and four-ways and displays — but of differentiating value delivery?
This is a topic we’re now exploring through whiteboard sessions and analyst and advisor discussions. We’re hard at work reviewing the crucial capabilities that will drive the 2018 cross-brand architecture.
Stay tuned. I’ll be sharing my hypotheses (and findings) as I forge ahead.
Global Director, Retail Sales
This is the second installment of the Tech in Retail series.
1 National Retail Federation. “2015 National Retail Federation Data.” 06 January 2015.
How did your desk change as a result of developing Internet of Things (IoT) projects and prototypes?
I’m sure we’ve all played the “spot the difference” game at least once.
Sometimes there are just… Read more
Over the last several years, Intel IT has been implementing the Information Technology Infrastructure Library (ITIL) framework to transform our service delivery and enable us to align more effectively with the strategies and priorities of each of Intel’s lines of business (LOBs). In doing so, we can focus on high-priority activities that may potentially transform Intel’s entire business and boost the relevancy of IT. As the Chief of Staff for Product Development IT and the Director of Business Solutions Integration for Intel IT, I’m looking forward to meeting with others who have found the same value in using this practice or are considering starting that journey.
Intel IT at the Forefront of Business Relationship Management
From the top down, Intel IT fully understands the importance of business relationship management. In the last 18 months, we have transitioned from an organization loosely coupled to the business to one directly aligned with the business, literally sitting at the table to help make key business decisions.
—Vaughan Merlyn, co-founder of the Business Relationship Management Institute
In 2013, Intel’s CIO, Kim Stevenson, personally asked each LOB to include an IT general manager (GM) on their staff. This suggestion was met favorably by the LOBs, who saw tremendous value in connecting more formally and more closely with IT.
Intel IT has adopted a user-centered approach to delivering IT services that enables us to optimize our IT solutions, improve employee productivity, and increase business velocity. Our user-centered approach involves proactively engaging and partnering with Intel employees and business groups to learn about their needs for information, technology, and services, as well as desired experience. ITIL has been integral in placing the customer at the center, and our new Business Solutions Integration (BSI) service aligns with our user-centered IT strategy. It integrates business relationship management and business demand management, presenting the LOBs with a “One IT” view. Each LOB has a dedicated IT LOB GM, along with other dedicated IT staff that form that LOB’s core IT team: a business relationship manager, a principal engineer, and a finance controller.
“The day I’m representing Intel’s LOB more than my day job, I’ve arrived.”
—Intel IT Staff Member
With a single point of contact for IT, the LOBs can more easily request services. But more important, IT is attuned to the LOB’s strategies, priorities, and pain points. We’ve slashed the time it takes us to say “yes” or “no” to a business request from an average of 36 hours to 8 hours, and our level of support has improved dramatically, according to annual Partnership Excellence surveys.
Run, Grow, Transform
IT used to be thought of as the organization that kept the lights on and the business running, building tools when necessary. But here at Intel, while Intel IT does indeed keep the business running, our best value lies in proactively collaborating with our customers. Therefore, instead of focusing exclusively on “Run” activities (such as providing network connectivity), we also actively pursue “Grow” and “Transform” activities.
In the “Grow” category, for example, we conduct proofs of concept (PoCs) and enterprise early adoption tests for emerging technologies. Even more valuable are our “Transform” activities, where we are directly involved in co-creating marketable products with our product groups and providing Intel with competitive advantage.
Our BSI service incorporates these higher-value activities through its integration with the IT2Intel program. I’ll explore each of these activities in more detail in future blogs. But briefly, our IT2Intel program enables us to accelerate Intel’s growth in enterprise markets by leveraging Intel IT’s expertise in partnership with Intel product groups.
Shifting with the Business
Our close alignment with Intel’s lines of business (LOBs) helps us shift our priorities to meet the growing demand from the Internet of Things Group (IoTG).
As an example of how our direct involvement with Intel’s LOBs shapes our work, consider the following graphic that shows the distribution of business requests from the various LOBs. In 2013, Intel’s Internet of Things Group (IoTG), represented by the dark blue block at the top of the left-hand graph, had very few requests for IT. But in 2014, the number of IoTG business requests grew significantly. Because we have a seat at the table, we were able to evolve with the business and meet the demands of this burgeoning sector of Intel’s market.
Through our close communication with the IoTG and early PoCs, we’ve deployed infrastructure based on the Intel® IoT Platform. We are leveraging that experience to help the group deliver solutions to Intel customers. This is just one example of how, through our BSI service, IT stays relevant and valuable to the entire enterprise.
I encourage you to connect with me on the IT Peer Network and on Twitter @azmikephillips to share your thoughts and experiences relating to IT business relationship management and how it can metamorphose the role of IT from transactional to transformational.
OEMs and other customers use Intel’s system-on-a-chip (SoC) products in their mobile devices. Intel makes a variety of SoCs, and any one SoC includes many components, with processor, memory controller, graphics, and sound integrated on a single chip. Each of these components comes with its own documentation, and there’s even more documentation that describes how to integrate these components with other custom components designed by the OEM. Pretty soon, you have tens of thousands of pages of documentation.
But each Intel customer needs only a fraction of the total available documentation — a piece here and a piece there. They don’t want to read a 20,000-page document to find the three paragraphs they need.
Intel IT recently partnered with the Intel product group that helps Intel customers with mobile device design, to improve the delivery of content to customers.
Enter Stage Right: Topic-Based Content
Which would you rather use: a 500-page cookbook with general headings like “stove-top cooking” and “oven recipes,” or one with tabs for breakfast, lunch, and dinner, and cross-references and indexes that help you find casseroles, breads, stir-fries, and crockpot recipes, as well as recipes that use a particular ingredient such as sour cream or eggs? Clearly, the latter would be easier to use because you can quickly find the recipes (topics) that interest you.
Darwin Information Typing Architecture, known as DITA (pronounced dit-uh), is an XML-based publishing standard defined and maintained by the OASIS DITA Technical Committee. DITA can help structure, develop, manage, and publish content, making it easier to find relevant information.
Four basic concepts underlie the DITA framework:
- Topics. A topic is the basic content unit of DITA, defined as a unit of information that can be understood in isolation and used in multiple contexts. Topics address a single subject and are short and standardized to include defined elements, such as name, title, information type, and expected results.
- DITA maps. DITA maps identify the products a topic is associated with and the target audience, which helps determine which topics are included in search results. DITA maps also include navigational information, such as tables of contents.
- Output formats. DITA-based content can be delivered in various formats, such as web, email, mobile, or print. For ease of use, the content’s final design and layout (its presentation) varies to accommodate the unique characteristics of each output format.
- Dynamic content. Customers can select and combine different topics to create their own custom documents, much like replacing one piece of a DNA map to create a brand-new animal.
(If DITA intrigues you, consider attending the 2015 Content Management Strategies/DITA North America conference in Chicago, April 20–22).
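The map-plus-topics idea is easy to picture in a few lines of code. The sketch below is purely illustrative, not Intel’s implementation: a tiny “DITA map” references two topic files, and a helper follows each topicref to assemble a custom binder, the way a reader bundles only the topics they need. Real DITA files carry DOCTYPE declarations and a much richer element set, and every file name and title here is invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical topic store: file name -> minimal DITA-like topic XML.
TOPICS = {
    "install.dita": """<topic id="install"><title>Installing the SoC board</title>
        <body><p>Seat the module and connect power.</p></body></topic>""",
    "configure.dita": """<topic id="configure"><title>Configuring the memory controller</title>
        <body><p>Set timing registers before boot.</p></body></topic>""",
}

# A minimal map: an ordered list of references to topics.
DITAMAP = """<map><title>Quick-Start Binder</title>
    <topicref href="install.dita"/>
    <topicref href="configure.dita"/>
</map>"""

def assemble_binder(ditamap_xml, topic_store):
    """Follow each <topicref> in the map and collect the referenced topic titles."""
    root = ET.fromstring(ditamap_xml)
    binder = [root.findtext("title")]
    for ref in root.iter("topicref"):
        topic = ET.fromstring(topic_store[ref.get("href")])
        binder.append(topic.findtext("title"))
    return binder

print(assemble_binder(DITAMAP, TOPICS))
# ['Quick-Start Binder', 'Installing the SoC board', 'Configuring the memory controller']
```

Because topics live independently of any one map, a second map can reference `configure.dita` without duplicating it, which is the reuse property discussed below.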
Intel’s Mobile Design Center Leverages DITA to Improve Our Customers’ User Experience
We designed a solution that eliminates the need for the previous long-form documentation. Instead, the solution enables SoC customers to assemble relevant content based on topics of interest. To achieve this, the Client Computing Group changed its documentation structure to topic-based content so that customers can quickly find highly specific information, enabling faster time to market for their mobile solutions and reducing the amount of time Intel engineers must spend helping customers find the information they need. The content is tagged with metadata so that customers can search on specific topics and bundle those topics into custom binders that they can reference or print as needed.
The Intel Mobile Design Center portal is described in detail in our paper, “Optimizing Mobile-Device Design with Targeted Content.” The portal’s ease of use contributed significantly to overall customer satisfaction with the solution. According to a survey we conducted, customer satisfaction scores have increased from 69 percent before implementation to 80 percent after.
Based on what the mobile communications group created in the Mobile Design Center, other groups are taking notice and creating their own design centers. For example, the Service Provider Division has committed to creating its own design center and is delivering all of its content in DITA to provide an even more interactive design for its customers.
Getting from Here to There
Converting existing FrameMaker and Word documents to DITA was not an easy undertaking. For the mobile communications group, some content wasn’t converted due to lack of time, although the group has committed to using DITA for all new content. The group performed the conversion manually, at a rate of about 5 to 10 pages per hour; the entire conversion project took months.
For the second group we worked with, who converted their entire documentation set, the conversion was accomplished using several methods. For large FrameMaker docs, they used a third-party product to partially automate the conversion process. While the resulting DITA docs still needed manual touch-up, the automated conversion was a time-saver. For smaller FrameMaker documents, topics were created manually. For Word docs, topics were manually cut and pasted.
So, was the effort worth it? Both groups agree that indeed it was. First, conversion to DITA revealed that there was a lot of duplication between documents. When in the DITA format, revisions to a topic only take place in that topic — there is no need to search for every document that contains that topic. Not only does this reduce the time it takes to make revisions, but it also improves the quality of our documentation. In the past, without DITA, some documentation might be out-of-date because a topic was revised in one place but not in another.
“By converting to DITA we reduced the amount of content, allowing for reuse. This also reduced the amount of work for the authors,” said one team member. “DITA gives you a better feel of the makeup of your content,” said another.
Other team members touted improved revisions and version control and the ability to tag content by more than just document name.
What’s Next for DITA at Intel?
Because the solution we created is scalable, we anticipate that additional product and business groups across Intel will begin to take advantage of topic-based content to improve customer experience and Intel’s efficiency.
I’d love to hear how other enterprises are putting DITA to work for their customers, increasing customer satisfaction, encouraging dynamic content creation, and accelerating the pace of business. Feel free to share your comments and join the conversation at the IT Peer Network.
Business analytics and data insights empower today’s business leaders to make faster decisions. A recent data consolidation and analytics project lifted Intel’s revenue by $264 million in 2014, as highlighted in our recently published Annual Business Review. This $264 million represents only a portion of the $351 million in value generated by Intel IT through the use of big data, business intelligence, and analytic tools. Access to connected data in an efficient and timely manner has enabled stakeholders to analyze market trends and make faster, better business decisions.
The Right Data at the Right Time
Intel’s business processes use a significant amount of historical data to reach decisions. But isolated datasets are not very useful because they provide only a glimpse of a much larger picture. Recognizing the power of connected data, Intel IT engaged in an 18-month data cleansing and consolidation effort, connecting more than 200 GB of historical data from various disparate and vertical systems using common measures and dimensions.
The complexity of this project was daunting. There were many spreadsheets and applications, and even the same data had inconsistent identifiers in different datasets. Our efforts resulted in replacing more than 4,000 spreadsheets with a single database solution that included over 1,000 data measures and 12 dimensions, as well as tracking information for about 4 million production and engineering samples provided to customers.
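The inconsistent-identifier problem is easy to picture with a toy example. The sketch below is illustrative only: the alias table, sheet contents, and field names are invented, and the real project used a proper database rather than Python dicts. The idea is the same, though: map each record’s identifier to a canonical key, then merge the measures into one consolidated store.

```python
# Toy alias table: every identifier variant maps to one canonical key.
ALIASES = {"CPU-A1": "cpu_a1", "cpua1": "cpu_a1", "cpu_a1": "cpu_a1"}

# Two "spreadsheets" that describe the same product under different IDs.
sheet1 = [{"id": "CPU-A1", "samples_shipped": 120}]
sheet2 = [{"id": "cpua1", "avg_test_hours": 6.5}]

def consolidate(*sheets):
    """Merge rows from many sheets into one store keyed by canonical ID."""
    store = {}
    for sheet in sheets:
        for row in sheet:
            key = ALIASES[row["id"]]  # normalize the inconsistent identifier
            merged = store.setdefault(key, {})
            merged.update({k: v for k, v in row.items() if k != "id"})
    return store

print(consolidate(sheet1, sheet2))
# {'cpu_a1': {'samples_shipped': 120, 'avg_test_hours': 6.5}}
```

Once every measure hangs off a shared key, common dimensions (product, quarter, region) can be layered on top, which is what makes the downstream analytics and visualization possible.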
Even connected data, however, is not inherently valuable unless it is conveyed in terms of trends and patterns that guide effective decision making. On top of our now-connected data, we added advanced analytics and data visualization capabilities that enable Intel’s decision makers to convert data into meaningful insights. About 9,000 application users who serve Intel and external customers have access to this data, along with 15,000 reporting users.
As part of the project, we automated our data management processes, so that we can now integrate new datasets in just a few hours, instead of in several months.
Boosting Sales with Reseller Market Insights
Another significant chunk of the previously mentioned $351 million — $76 million — was generated by a sales and marketing analytics engine that provides valuable information to Intel sales teams, helping them strategically focus their sales efforts to deliver greater revenue. The engine’s recommendations identify which customers sales reps should contact and what they should talk to them about. This data significantly shortened the sales cycle and enabled sales reps to reach customers who were previously off the radar. (Watch a video about the analytics engine here.) The fact that this recommendation engine garnered Intel a 2014 CIO 100 award illustrates how important CIOs consider technology in today’s business environment.
What’s Next for Data Visualization at Intel
Going forward, we intend to bring collaborative analytics to Intel decision makers. For example, Intel IT has developed an Info Wall that harnesses the power of data visualization. This solution is built on Intel® architecture and is Intel’s first interactive video wall, with a viewing area measuring 5 feet high and 15 feet wide. While it’s too early to state any specific results, this unique implementation will enable new possibilities for business intelligence and data visualization. Currently, the Info Wall and data focus on sales and marketing; we plan to soon expand its application to other areas of Intel’s business.
In an age when organizations such as Intel are rich in data, finding value in this data lies in the ability to analyze it and efficiently derive actionable business intelligence. Intel IT will continue to invest in tools that can transform data into insights to help solve high-value business problems.
Security was a major area of focus at HIMSS 2015 in Chicago. From my observations, here are a few of the key takeaways from the many meetings, sessions, exhibits, and discussions in which I participated:
Top-of-Mind: Breaches are top-of-mind, especially cybercrime breaches such as those recently reported by Anthem and Premera. No healthcare organization wants to be the next headline or suffer the staggering business impact that follows. Regulatory compliance is still important, but in most cases it is not currently the top concern.
Go Beyond: Regulatory compliance is necessary but not sufficient to mitigate the risk of breaches. To have a fighting chance at avoiding most breaches, and at minimizing the impact of those that do occur, healthcare organizations must go well beyond the regulatory minimum.
Multiple Breaches: Cybercrime breaches are just one kind of breach. There are several others, for example:
- There are also breaches from loss or theft of mobile devices. Although often less impactful (because they typically involve a subset rather than all patient records), these occur far more frequently than the cybercrime breaches that have hit the news headlines recently.
- Insider breach risks are widely underappreciated, and to say they are not sufficiently mitigated would be a major understatement. This kind of breach involves a healthcare worker accidentally exposing sensitive patient information to unauthorized access. In practice, this occurs when patient data is emailed in the clear, put unencrypted on a USB stick, posted to an insecure cloud service, or sent via an unsecured file-transfer app.
- Healthcare workers are increasingly empowered with mobile devices (personal, BYOD, and corporate), apps, social media, wearables, the Internet of Things, and more. These enable amazing new benefits in improving patient care, but they also bring major new risks. Well-intentioned healthcare workers, under time and cost pressure, have ever more freedom to do wonderful things for care, but also more opportunities to stumble into accidents that can lead to breaches. Annual “scroll to the bottom and click accept” security awareness training is often ineffective, and certainly insufficient.
- To improve the effectiveness of security awareness training, healthcare organizations need to engage healthcare workers on an ongoing basis. Practical strategies I heard discussed at this year’s HIMSS include gamified spear-phishing solutions that let organizations simulate spear-phishing emails and train workers to recognize and avoid them. Weekly or biweekly emails can also help workers understand recent healthcare security events, such as breaches at peer organizations (a “keeping it real” strategy): how they occurred, why they matter to healthcare workers, patients, and the organization, and how everyone can help.
- Ultimately, any organization seeking to achieve a reasonable security posture and sufficient breach-risk mitigation must first successfully instill a culture of “security is everyone’s job.”
What questions do you have? What other security takeaways did you get from HIMSS?
This article is meant for those programmers who are only getting started with the Visual Studio environment and trying to compile their C++ projects under it. Everything looks strange and complicated… Read more
On April 19th, 1965, Gordon Moore introduced a fundamental way to view growth in technology, later labeled “Moore’s Law”: approximately every two years, the number of transistors on a chip would double. This culminated in a layman’s explanation of … Read more >
Hi, I am trying to set up an Intel SCS server to deploy AMT profiles to HP Intel vPro PCs. In order to do this, I need to provision a certificate. I got one from Comodo, but Intel SCS is asking for a CA plugin. I couldn’t find this anywhere. Is onl… Read more
The idea of precision medicine is simple: When it comes to medical treatment, one size does not necessarily fit all, so it’s important to consider each individual’s inherent variability when determining the most appropriate treatment. This approach makes sense, but until recently it has been very difficult to achieve in practice, primarily due to lack of data and insufficient technology. However, in a recent article in the New England Journal of Medicine, Dr. Francis Collins and Dr. Harold Varmus describe President Obama’s new Precision Medicine Initiative, saying they believe the time is right for precision medicine. The way has been paved, the authors say, by several factors:
- The advent of important (and large) biological databases;
- The rise of powerful methods of generating high-resolution molecular and clinical data from each patient; and
- The availability of information technology adequate to the task of collecting and analyzing huge amounts of data to gain the insight necessary to formulate effective treatments for each individual’s illness.
The near-term focus of the Precision Medicine Initiative is cancer, for a variety of good reasons. Cancer is a disease of the genome, and so genomics must play a large role in precision medicine. Cancer genomics will drive precision medicine by characterizing the genetic alterations present in patients’ tumor DNA, and researchers have already seen significant success with associating these genomic variations with specific cancers and their treatments. The key to taking full advantage of genomics in precision medicine will be the use of state-of-the-art computing technology and software tools to synthesize, for each patient, genomic sequence data with the huge amount of contextual data (annotation) about genes, diseases, and therapies available, to derive real meaning from the data and produce the best possible outcomes for patients.
Big data and its associated techniques and technologies will continue to play an important role in the genomics of cancer and other diseases, as the volume of sequence data continues to rise exponentially along with the relevant annotation. As researchers at pharmaceutical companies, hospitals and contract research organizations make the high information processing demands of precision medicine more and more a part of their workflows, including next generation sequencing workflows, the need for high performance computing scalability will continue to grow. The ubiquity of genomics big data will also mean that very powerful computing technology will have to be made usable by life sciences researchers, who traditionally haven’t been responsible for directly using it.
Fortunately, researchers requiring fast analytics will benefit from a number of advances in information technology happening at just the right time. The open-source Apache Spark™ project gives researchers an extremely powerful analytics framework right out of the box. Spark builds on Hadoop® to deliver faster time to value to virtually anyone with some basic knowledge of databases and some scripting skills. ADAM, another open-source project, from UC Berkeley’s AMPLab, provides a set of data formats, APIs and a genomics processing engine that help researchers take special advantage of Spark for increased throughput. For researchers wanting to take advantage of the representational and analytical power of graphs in a scalable environment, one of Spark’s key libraries is GraphX. Graphs make it easy to associate individual gene variants with gene annotation, pathways, diseases, drugs and almost any other information imaginable.
At the same time, Cray has combined high-performance analytics and supercomputing technologies into the Intel-based Cray® Urika-XA™ extreme analytics platform, an open, flexible, and cost-effective platform for running Spark. The Urika-XA system comes preintegrated with Cloudera Hadoop and Apache Spark and is optimized for the architecture to save time and management burden. The platform uses fast interconnects and an innovative memory-storage hierarchy to provide a compact, powerful solution for the compute-heavy, memory-centric analytics at which Hadoop and Spark excel.
Collins and Varmus envision more than 1 million Americans volunteering to participate in the Precision Medicine Initiative. That’s an enormous amount of data to be collected, synthesized and analyzed into the deep insights and knowledge required to dramatically improve patient outcomes. But the clock is ticking, and it’s good to know that technologies like Apache Spark and Cray’s Urika-XA system are there to help.
What questions do you have?
Ted Slater is a life sciences solutions architect at Cray Inc.
By Lisa Malloy, director of Policy Communications and Government Relations Today, Congress took a critical step in advancing U.S. global competitiveness with the introduction of Trade Promotion Authority (TPA) legislation. This is legislation our country sorely needs to maintain and accelerate … Read more >
I’m excited to announce that Intel has successfully closed the acquisition of Lantiq. This acquisition enables us to extend our success in cable home gateways into DSL and fiber markets giving us full coverage of broadband access methods around the world. … Read more >
We had an Intel IoT roadshow in Bengaluru on the 11th and 12th of April 2015 at the MLR Convention Center, Whitefield. We got an overwhelming response from IoT enthusiasts eager to participate in this event…. Read more
- Moore’s Law Wiki Page
- 50 Years of Moore’s Law on Slashdot
- 66% of Americans do not know Moore’s Law
- Follow Gael on Twitter: @GaelHof
50 years ago in April, Gordon Moore, one of the co-founders of Intel Corporation, made a … Read more >
Editor’s Note: This blog is authored by Intel’s Sandra Lopez who is the Director of Strategic Alliances for Intel-based Wearables. There’s been a lot of attention lately on the topic of diversity: in society, in the workplace, in business, and—particularly … Read more >
LA Hacks returned to UCLA last weekend. 1400 developers from the UC and Cal Poly universities, nearby state and city colleges, and out of state schools made the journey to Pauly Pavilion and competed… Read more
Buying a car today is a lot of work, especially with so many new electronics options to learn about and consider. It’s stressful knowing that after driving off the lot, you’re stuck with what you bought with almost no chance … Read more >
The post How the Internet of Things Can Unlock the Door to a More Robust BMS appeared first on IoT@Intel.
Cyber attackers and researchers continually evolve, explore, and push the boundaries of finding vulnerabilities. Hacking hardware is the next step on that journey. It is important for computing device makers and the IoT industry to understand they are now under the microscope and attackers are a relentless and unforgiving crowd. Application and operating systems have taken the brunt of attacks and scrutiny over the years, but that may change as the world embraces new devices to enable and enrich our lives.
Vulnerabilities exist everywhere in the world’s technology landscape, but they are not equal and it can take greatly varying levels of effort, timing, luck, and resources to take advantage of them. Attackers tend to follow the path-of-least-resistance in alignment with their pursuit of nefarious goals. As security closes the easiest paths, attackers move on to the next available option. It is a chess game.
In the world of vulnerabilities there is a hierarchy, from easy to difficult to exploit and from trivial to severe in overall impact. Technically, hacking data is easiest, followed by applications, operating systems, firmware, and finally hardware. This is sometimes referred to as the ‘stack’ because it is how systems are architecturally layered.
The first three areas are software: very portable and dynamic across systems, but subject to great scrutiny by most security controls. Trojans are a classic example, where data is modified with malicious payloads and can be easily distributed across networks. Such manipulations are relatively exposed and easy to detect at many different points. Applications can be maliciously written or infected to act in unintended ways, but pervasive anti-malware is designed to protect against such attacks and is constantly watchful. Vulnerabilities in operating systems provide a means to hide from most security, open up a bounty of potential targets, and offer a much greater depth of control. Knowing the risks, OS vendors are constantly identifying problems and sending a regular stream of patches to shore up weaknesses, limiting the viability of continued exploitation by threats. It is not until we get to firmware and hardware that most of the mature security controls drop away.
The firmware and hardware residing beneath the software layers tend to be more rigid and represent a significantly greater challenge for compromising systems and scaling attacks. However, success at the lower levels means bypassing most detection and remediation security controls, which live above, in the software. Hacking hardware is very rare and intricate, but not impossible. The level of difficulty tends to be a major deterrent, while the ample opportunities and ease that exist in the software layers are more than enough to keep hackers comfortable staying with easier exploits in pursuit of their objectives.
Some attackers are moving down the stack. They are the vanguard, blazing a path for others to follow. Their efforts, processes, and tools will be refined and reused by others. There are tradeoffs to attacks at any level. The easy vulnerabilities in data and applications yield far less benefit for attackers in the way of remaining undetected, persisting after actions are taken against them, and the overall level of control they can gain. Most security products, patches, and services have been created to detect, prevent, and evict software-based attacks. They are insufficient at dealing with hardware or firmware compromises. Due to the difficulty and lack of obvious success, most vulnerability research doesn’t explore much in the firmware and hardware space. This is changing. It is only natural: attackers will seek to maneuver where security is not pervasive.
As investments in offensive cyber capabilities from nations, organized crime syndicates, and elite hackers-for-hire continue to grow, new areas such as IoT hardware, firmware, and embedded OS vulnerabilities will be explored and exploited.
Researchers targeting hardware are breaking new ground which others will follow, eventually leading to broad research in hardware vulnerabilities across computing products which influence our daily lives. This in turn will spur security to evolve in order to meet the new risks. So the chess game will continue. Hardware and firmware hacking is part of the natural evolution of cybersecurity and therefore a part of our future we must eventually deal with.
IT Peer Network: My Previous Posts
Embrace secure retail. Embrace mobile point-of-sale: As we are all aware, this October the standards for securing retail transactions in the US will shift to EMV. The standard will define a new era in credit card purchases by making them more… Read more