
RECENT BLOG POSTS

What Does CIO Reporting Structure Mean for IT at Large?

A previous manager of mine used to say that structure follows strategy. So it seems logical to conclude that a business’s organizational structure contains significant insights about – and implications for – the role of IT within that company.

 

Gone is the traditional expectation of IT as a cost center, and with it the expectation that the CIO would report directly to the CFO. Every new reporting structure that emerges starts a new conversation about strategy and importance. For example, here are a few that I ran across on Twitter:

 

When the CIO reports to the CEO, IT has a chance at being a valuable part of the business.

— Scott W. Ambler (@scottwambler) September 18, 2014

 

#CIO reporting to the #CMO? It may be a hot trend but is the wrong strategic move! http://t.co/RxhartTnhx

— Jeffrey Fenter (@JeffreyFenter) March 9, 2014

 

If a CIO reports up into the CFO, the CFO must be willing to sacrifice finance risk to make systems risk the priority. Can that ever happen?

— Wes Miller (@getwired) September 26, 2014

With IT on its way to being seen as a driver, an enabler, and – most importantly – a partner of the business, it seems that the CIO’s natural evolution would be to report directly to the CEO. This relationship may solidify the business’s view of IT as a strategic differentiator – a segment of the business worthy of the CEO’s direct attention.

 

A Gartner report released this past October showed that CIOs are already pulling up a prominent seat at the proverbial table, with 41% reporting directly to their CEO.

 

This made me wonder – to whom do the CIOs of Intel IT Peer Network readers and Intel IT Center followers (LinkedIn, Twitter, Google+, Facebook) report? So we created a poll to discover whether this reporting trend extended to our community of IT leaders as well.

 

The results were interesting – the majority of our readers responded that their CIOs report directly to their CEO, while the traditional CIO/CFO model was cited as the second most common reporting structure.

 

[Image: CIO reporting structure poll results]

To keep building a picture of the reporting-structure landscape, I’ve left the poll open for further votes – let me know who your CIO reports to, and I’ll check in again in a few months.

 

Connect with me in the comments below or on Twitter (@chris_p_intel) – I’d love to know how you view organizational structure and its impact on IT (or vice versa).

 

Does who the CIO reports to imply anything about the importance of the role or is it simply a meaningless line on an org chart?

Read more >

BI Does Not Guarantee Better Decisions, Only Better-Informed Decisions

In my blog, What is Business Intelligence? (BI), I talked about faster, better-informed decision making. I want to expand on these two key pieces. What does it mean when we say “faster” decision making? And why do we say “better-informed” decisions instead of “better decisions?”

 

Putting aside the semantic differences and nuances of meaning, these two concepts play a significant role in delivering BI solutions that can address both the urgency needed by business and the agility required by IT.

 

Moreover, exploring these concepts, regardless of your interpretation, will facilitate better engagements and produce tangible outcomes that can benefit the entire organization, both in the short term and in the long run.

 

BI is all about speed when capitalizing on opportunities

 

Speed plays a more important role than ever before when capitalizing on opportunities, whether they contribute to growth or to the bottom line. Moreover, speed plays a role in every facet of business transactions – sometimes before a transaction is even completed – wherever business data is born or created. We no longer operate in a world of business PCs chained to desks and accessed during banking hours. Instead, mobility fuels global transactions that take place around the clock.

 

  • Speed dictates our options. For example, when the opportunity to enter a new market or adjust a marketing campaign’s variables presents itself, the need for insight grows exponentially as we consider our options to react while the clock is ticking. As questions are formulated about both the past and the future, historical data provides only a starting point for decisions that will eventually shape our company’s direction. Nor does this happen only occasionally or on a fixed, predictable schedule that would allow us to prepare our teams.
  • Business operations are modeled to match the pace of change even if our existing infrastructure isn’t equipped to handle the heavy load and sudden curves of the road. We often hear the words “uncertainty” and “risk” when executives talk about trying to make business decisions.
  • The questions we ask today aren’t the same ones we asked last week, nor are they the questions we’ll ask next week. We can no longer deliver business information (forget insight for a moment) using traditional methods that may require long gestation periods. Hence, “faster” demands speed and agility, both of which require not only ability but also accuracy.

 

The speed at which we gain insight is critical because it allows us to take advantage of the opportunity at full throttle. Agility is essential because most of these opportunities or challenges don’t RSVP before they show up at our doorstep. They are identified by talented individuals who move organizations forward.

 

Ability is what makes this whole thing feasible under pressure. Besides, how can we even talk about insight if we don’t have the data or can’t obtain it to begin with? Accuracy—even if it isn’t perfect—plays a vital part because many times we can’t afford unforced errors that would otherwise defeat the purpose of data-driven decision making.

 

BI can make us better-informed decision makers—but it does not necessarily make us smarter

 

With the exception of automated business processes, such as online credit card applications, many critical business decisions are still made by humans (despite what many sci-fi movies portray). Whether we’re developing a business strategy or executing that strategy, leaders and managers still want to rely on insight derived from solid business data. Though many factors play into the decision-making process, ultimately our goal must be to employ data-based analysis and to look at the evidence using critical thinking.

 

Data has to be solid; otherwise, it’s “garbage in, garbage out.” Do we have a single version of the truth? Do we trust the data? Do we ask the right questions? We need to be ready and willing to admit that we may be wrong about our assumptions or conclusions if we can identify flaws (supported by reliable data) in our initial assessment. We must be willing to play devil’s advocate. And maybe we don’t blink, but we do think twice when we can afford to. As the old saying goes, “measure twice and cut once.”

 

However we get there, data alone will not suffice – we know that. All of these variables will inevitably shape not only the final decision we make, but also the path we choose to arrive at it. History is filled with examples of leaders making “bad” decisions even with ample data to support the decision-making process.

 

Bottom Line

 

We may not be able to prevent all of the bad or flawed decisions, but we can promote a culture of data-driven decision making at all levels of our organization so that corporate data is seen as a strategic asset. Informed patients are able to make better-informed healthcare decisions. Informed consumers are able to make better-informed buying decisions. Likewise, BI should be a framework to enable “better-informed” decision making at all levels of an organization, while still allowing the final call to lie with us—the humans (at least for now).

 

Connect with me on Twitter (@KaanTurnali) and LinkedIn.

This story originally appeared on The Decision Factor.

Read more >

Mobile Allows Doctors to Answer, ‘How Did You Do This Week?’



Mobile devices and technology have allowed clinicians to gather patient data at the point of care, access vital information on the go, and untether from traditional wired health IT infrastructures. One hidden benefit of mobile capability is how doctors can gain access to data that analyzes their own performance.


In the video above, Jeff Zavaleta, MD, chief medical officer at Graphium Health and a practicing anesthesiologist in Dallas, shares his insight on how mobile devices offer a new opportunity for practitioners to self-evaluate, answer the question “How did you do this week?”, and see key performance indicators such as their average patient recovery times and on-time appointment starts.

 

Watch the short video and let us know what questions you have about the future of mobile health IT and where you think it’s headed. How are you using mobile technology to improve your practice?

 

Also, be on the lookout for new blogs from Dr. Zavaleta, who will be a guest contributor to the Intel Health & Life Sciences Community.

 

Read more >

Checklist For Designing a New Server Room

Designing a new server room may initially seem a daunting task; there are, after all, many factors and standards to consider. However, setting up the space and equipment doesn’t have to be an ordeal as long as you plan in advance and make sure you have all the necessary items. Here’s a checklist to facilitate the design of your data center.


Spatial Specifications

  • Room should have no windows.
  • Ceilings, doors and walls should be sound-proofed.
  • Ceiling height should be at least nine feet.
  • Floor should be raised, with an anti-static surface.

 

Equipment Specifications

 

  • Computer racks should have a clearance of at least 42 inches.
  • All racks should have proper grounding and seismic bracing.
  • Computing equipment power density should not exceed 300 watts per square foot (a quick sanity-check sketch follows this list).
  • Server room should contain fire, smoke, water and humidity monitors.
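
As a rough, purely illustrative way to sanity-check that power-density guideline, the Python sketch below totals equipment load for a hypothetical room; the dimensions and per-rack wattages are made-up assumptions, not figures from this checklist.

    # Hypothetical sanity check against the 300 W/sq ft guideline above.
    # Room dimensions and per-rack loads are illustrative assumptions.
    ROOM_LENGTH_FT = 30
    ROOM_WIDTH_FT = 20
    MAX_WATTS_PER_SQFT = 300

    racks = {
        "compute rack A": 8000,   # watts
        "compute rack B": 8000,
        "storage rack": 5500,
        "network rack": 2500,
    }

    area_sqft = ROOM_LENGTH_FT * ROOM_WIDTH_FT
    total_watts = sum(racks.values())
    density = total_watts / area_sqft

    print(f"total load: {total_watts} W over {area_sqft} sq ft = {density:.0f} W/sq ft")
    if density > MAX_WATTS_PER_SQFT:
        print("over the guideline: add floor space or reduce equipment load")
    else:
        print("within the guideline")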

 

Cooling Specifications

 

  • Racks should be arranged in a hot-aisle/cold-aisle configuration.
  • Underfloor cooling systems require a raised floor with a minimum height of 24 inches that can hold the weight of server racks and equipment.


Electrical Systems Specifications

  • Computer equipment and HVAC should have separate power panels.
  • There should be no heat-generating support equipment located in the server room.
  • Electrical systems should have an isolated ground, grounding grid and dedicated neutral.
  • Separate back-up power should be available for the data center.
  • The electrical system should have a shunt trip for purposes of emergency shutdown.

 

Data Center Resources has had a reputation for providing superior data center solutions since 2002. Our dedicated team understands that while the solution is important, it is only a part of the overall relationship with our clients. Responsiveness, after sale service, ease of purchasing and breadth of product offerings are other important factors, and we are committed to exceeding expectations in all of these areas. Our principals and project specialists each have several years of experience in providing technical environment solutions.  Contact our team today to find out how we can help you design a new server room.

Read more >

Accelerating the Transformation to Software Defined Storage

We are all aware of the explosion of data driving social media, video streaming and other content-rich applications. Most of it needs to be stored somewhere.  And many of the new applications and devices using this data consume more resources and need to be deployed more rapidly than traditional back-office enterprise applications.

 

Six months is now too long a wait to deploy a new application or service. Today, IT organizations need to move in a matter of hours. This means the storage environment needs to be flexible, scalable and responsive. In other words, data centers need to be more cloud-ready, or software-defined, which presents numerous challenges to the traditional storage environment.

 

For data centers to transform into scalable, flexible environments, storage must evolve from purpose-built, dedicated silos designed for specific applications into large, general-purpose pools that are dynamically allocated and controlled via a software-defined storage control layer.

 

Intel’s vision of software-defined storage (SDS) is a framework that enables dynamic, policy-driven management of storage resources – a world where the application defines the storage.  “Intel is helping to accelerate this storage transformation with processor innovations, non-volatile memory, and networking and fabric products optimized for storage workloads,” says Andreas Schneider, Intel EMEA Storage Product Marketing Manager.

 

“We are actively engaged with partners in the ecosystem to deliver optimized SDS solutions that take advantage of these latest technologies and have reference architectures, or ‘recipes’, available as well.”

For example, the Intel® Storage Acceleration Library provides optimized algorithms that streamline the path through the processor, improving the performance of storage functions like deduplication, compression and erasure coding. These libraries are available to storage OEMs and ISVs and have been contributed to the open source community.
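
To make the erasure-coding item on that list concrete, here is a minimal, purely conceptual Python sketch of single-parity protection (the RAID-5 idea): one parity block lets any single lost data block be rebuilt. This is an illustration of the principle only, not the library’s API; production SDS stacks use Reed-Solomon codes that tolerate multiple simultaneous failures, and the block contents below are invented.

    # Conceptual sketch: single-parity erasure coding (illustrative only).
    def xor_blocks(blocks):
        # XOR a list of equal-length byte blocks together.
        result = bytearray(len(blocks[0]))
        for block in blocks:
            for i, b in enumerate(block):
                result[i] ^= b
        return bytes(result)

    data_blocks = [b"blockA..", b"blockB..", b"blockC.."]   # hypothetical 8-byte blocks
    parity = xor_blocks(data_blocks)                        # stored on a separate drive/node

    # Simulate losing one data block, then rebuild it from the survivors plus parity.
    lost = 1
    survivors = [blk for i, blk in enumerate(data_blocks) if i != lost]
    rebuilt = xor_blocks(survivors + [parity])
    assert rebuilt == data_blocks[lost]
    print("rebuilt:", rebuilt)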

 

Fujitsu recently announced a new hyperscale SDS appliance based on Intel technologies and Ceph open source software. The appliance is ideal for cost-sensitive users who need instant online access to large data volumes. Fujitsu and Intel collaborated to integrate Intel processors, SSDs, and software technologies such as Virtual Storage Manager (VSM). VSM is an open source software tool which simplifies Ceph cluster setup and management.

 

To continue moving forward with the SDS vision, we need the SDS controller, which has visibility into and control of all storage resources and handles communication between applications, the orchestrator and the storage systems. This SDS controller needs to be based on open standards to allow for interoperability across hardware platforms.

 

Schneider says, “We are working with the community to develop an open SDS controller prototype to help demonstrate the concept and validate the value proposition. As part of this effort, Intel is looking to partner with the OpenStack community to help solve key storage challenges.”

 

Where are you on the path to SDS? Does the requirement for the SDS controller resonate with you?

 

- Arif

Read more >

More Secure Programming – Where to Begin with Static Code Analysis

Identifying and correcting security vulnerabilities in applications has become increasingly vital, and static code analysis tools used in conjunction with manual code reviews are a key part of that effort. Static code analysis relies on an automated software tool that examines a program’s source code without actually executing it. This type of analysis is used to identify different kinds of security issues, obscure logic problems, bugs and defects, and more. Even more importantly, it is becoming common for organizational policy to require it, and it is already a compliance requirement for organizations that must comply with the Payment Application Data Security Standard (PCI PA-DSS).

 

There is a plethora of vendors with static code analysis tools that we won’t be comparing here, but rest assured that the most common development languages are supported. These tools can be very helpful in determining adherence to secure coding standards. One of the biggest challenges to getting started, though, is the shock of the report after an initial codebase is analyzed: there could be tens of thousands of issues found in a large codebase that has never been scanned before. Going through the static code analysis report can help identify high-risk security areas, but researching findings that may turn out to be false alarms can also be time-consuming. Either way, the effort must be made to review such a report, as it helps demonstrate due diligence by documenting the review of potential vulnerabilities. For the software engineer asked to address issues found in a large legacy codebase, it can add stress on top of the workload of developing the next release.
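
For readers new to these tools, here is a small, hypothetical Python example of the kind of finding most static analyzers report (the function names are invented for illustration): SQL built by string formatting is flagged as a potential injection sink, and the usual remediation is a parameterized query.

    import sqlite3

    # Hypothetical finding: building SQL from user input via string formatting.
    # Most static analyzers flag this pattern as a potential SQL injection sink.
    def find_user_unsafe(conn, username):
        query = "SELECT id, role FROM users WHERE name = '%s'" % username  # flagged
        return conn.execute(query).fetchone()

    # Typical remediation: a parameterized query lets the driver handle escaping.
    def find_user_safe(conn, username):
        return conn.execute(
            "SELECT id, role FROM users WHERE name = ?", (username,)
        ).fetchone()

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (id INTEGER, name TEXT, role TEXT)")
        conn.execute("INSERT INTO users VALUES (1, 'alice', 'admin')")
        print(find_user_safe(conn, "alice"))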

 

If a threat model was completed during the design phase of application development, it can help describe the security objectives or privacy requirements for the application and how those objectives mitigate threats in possible misuse or abuse cases. The main focus should have been on protecting the system and the information being processed. Furthermore, an attack surface analysis helps define how an external adversary may attempt to attack the application and focuses on the higher-risk areas where there is more exposure, such as Internet-connected interfaces. If these tools were not used during the development phase, perhaps other risk-based approaches provided the same result. But if not, it’s advisable to start having these conversations with all stakeholders so that the security objectives and attack surface mitigations can be well defined. It’s likely that an Advanced Persistent Threat (APT), albeit with limited knowledge of the system, would use similar tools when attempting to identify an application’s potential weaknesses for the purpose of exploiting them.

 

A requirement for static code analysis is now commonly integrated into an organization’s secure application development processes, and it helps with adherence to ISO 27034. It’s also advisable to integrate threat modeling and attack surface analysis into the lifecycle. These tools help with prioritization, so that issues identified in static code analysis reports can be addressed starting with the most important security features of an application. This will undoubtedly help the security reviewer gain traction on an effort that may seem overwhelming at first.

 

 

Find Andy on LinkedIn

See previous content from Andy_Good 

Start a conversation with Andy on Twitter

 

 

 

Read more >

Safeguarding the Smart City of the Future

This is the second installment of a four part series on Smart Cities with Dawn Olsen (#2 of 4).

Click here to read blog #1

Click here to read blog #3


In the age of smart cities, it’s an unavoidable truth that although technology can deliver new and amazing capabilities, it can also potentially be a disruptive force. That’s why security is a major focus within the development of the Internet of Things.  It is important to use integrated technologies to address some of the security challenges that can emerge in connected cities.


While social media has evolved the experience of events (both big and small) by virtually connecting people, new potential threats have surfaced as a result. When the world’s eyes are on major events like the 2012 Olympic Games in London or the 2014 World Cup in Rio de Janeiro, protest groups may try to make the most of this global attention.


It’s not just major world events that are susceptible to such disturbances.  An unfortunate Dutch teenager found this out when their not-so-sweet 16th birthday party was crashed by 3,000 people after the event announcement went viral on Facebook. The small event invitation spread wider and wider on social media, resulting in large crowds gathering and, eventually, police arriving on the scene in riot gear.  Cases like this are rare, but I mention it because city authorities and security services have learned lessons from it.

Intel is working with authorities to safeguard against threats like these. We’re bringing together new tools, connected to the Internet of Things. From the devices in our hands and the sensors on our streets to the gateways, servers and cloud-based management platforms used to orchestrate complex security networks, innovative tech is developing on behalf of city security. It’s certainly a far cry from traditional measures like simple CCTV networks. By combining inputs from different sources and various types of technology in real time, Intel is pioneering “joined-up” security solutions. As mentioned, social media can be part of the solution as well as the challenge.  By gathering data from public networks like Twitter and Foursquare, police forces can now track situations as they unfold. And support for these measures exists: an Intel survey found that 61 percent of Americans believe it’s worthwhile for a connected city to gather anonymous information about people – that is, if the data is then used to benefit the area.


In this way, authorities in command centers now have access to up-to-date intelligence.  Big Data analytics can be used to predict where trouble may occur so that preventative measures can be taken ahead of time. The technology is already showing results in pilots around the world. In one European city, for example, Intel is working with local authorities to police a 400-meter strip with around 50 bars that attract 20,000+ visitors each weekend. Using a system that integrates social media monitoring, light sensors, and sound-monitoring technologies, Intel is harnessing the full IoT ecosystem to help the city improve safety, reduce security and maintenance costs, and turn a thriving event into an even more economically viable phenomenon for the city.


As smart cities grow, security remains the number one priority for city leaders. By supplying the necessary tools and the expertise, Intel is helping these cities harness the Internet of Things to keep city dwellers safe.


This is the second installment of a four part series on Smart Cities (#2 of 4).  Click here to read blog #1.


To continue the conversation, let’s connect on Twitter @DawnOlsen


Dawn Olsen

Global Sales Director

Government Enterprise, Intel

Read more >

Cutting Congestion in Our Smart Cities

This is the third installment of a four part series on Smart Cities with Dawn Olsen (#3 of 4). 

Click here to read blog #1

Click here to read blog #2


One of the most visible results of a growing urban population is increased traffic congestion.  This affects not just our ability to go about our business, but also the quality of the air that we breathe.  In my next post in this blog series, I will discuss Intel’s work to help improve air quality.  Today, I’d like to focus on smart transport.


More than half of the world’s population lives in cities, and the number is growing.  It’s predicted that by 2025, 37 so-called megacities around the world will have a population of more than 10 million. As the number of city dwellers continues to rise, so does the number of vehicles.  It is for this reason that congestion is becoming a bigger issue in cities that are ill-prepared for such growth.  In São Paulo, for example, the journey to work can now last hours, and some commuters are resorting to helicopters to get around.


Cities are beginning to take action to tackle this growing congestion problem.  With the ever-evolving innovations in technology, new opportunities and solutions are presented all the time. For instance, by combining existing infrastructure with smart sensors and Intel® Gateway Solutions for the Internet of Things (IoT), a city’s roadway system can be completely transformed – traffic control centers can gather useful data and get a view of how the whole transport system is working in order to improve efficiency.    


While traffic jams cause inconvenience for everyday commuters, they can also cause critical delays for emergency response teams.  Intel is working with city partners to develop an end-to-end smart transport system that uses real-time data to calculate the fastest route for emergency services.  Through this system, vehicle-to-vehicle communications can be connected to sensors that monitor congestion levels.  With this integration, alert messages can be triggered to update a central management system. The “Intel Freeway to the Future” survey found that, if a smart transport system such as this were built, 59% of Americans would opt in to a city program to have a sensor fitted in their car.


Planning sustainable traffic systems is an important part of the long-term initiatives Intel has developed within cities like San Jose, California. By bringing together the technological building blocks that make up the Internet of Things, Intel is helping the city improve infrastructure.  The aim in the next 40 years is to reduce the total miles travelled in the city by 40 percent while creating 25,000 new clean-tech jobs. This project has been recognized by the White House as an integral part of the Smart America initiative.  Other public-private collaborations involving smart cities and Intel – like the ongoing project in Dublin, Ireland – have also been noticed.


There are countless potential applications for this technology that can enhance the lives of city dwellers.  One possibility is an intelligent routing system for school buses, so that parents can track the bus and ensure that their kids are traveling safely and on schedule.  Other opportunities exist within the idea of building a smart transport system that connects different services.  Suppose a ferry is running late; the smart transport system could alert local buses to depart later for a smooth travelling connection.


For those of us who can’t afford helicopter rides to and from work, Intel’s smart city initiatives are driving sustainable, effective solutions to meet the challenges of rising congestion levels in the city.


This is the third installment of a four part series on Smart Cities (#3 of 4).

Click here to read blog #1

Click here to read blog #2


To continue the conversation, let’s connect on Twitter @DawnOlsen


Dawn Olsen

Global Sales Director

Government Enterprise, Intel

Read more >

mHealth: Connecting Consumers with Care

The buzz following the mHealth Summit has been encouraging, to say the least. The December event drew 4,000 attendees, who were brought up to speed on the latest developments spanning policies and research, global health, hospital mobility, consumer engagement, privacy and security and, of course, emerging technologies.

 

The two areas of focus that I found most encouraging centered on consumer engagement and care coordination.

 

Far too often, when the industry talks about mobile health, the technology itself – or even just the promise of an emerging technology – has a way of quickly overpowering the dialogue. But as the Center for Connected Medicine’s Joseph Kvedar touched on (and several panels advanced the notion), one of the biggest issues facing healthcare right now is getting and keeping consumers interested in their own care. The success of mobile devices and apps, as well as early consumer interest in wearables, is encouraging because it shows that all the pieces are in place. But until consumers show as much interest in sharing their health information with their doctors as they do in, say, sharing Facebook posts, the healthcare system overall will continue to struggle.

 

Given this present state of consumer engagement, news that care coordination works was all the more welcome.

 

As mHealthNews reported: “In health systems large and small, clinicians are using smartphones to instantly connect with others caring for the same patient. They’re sharing notes and tests, discussing treatment plans and, in many cases, bringing the patient and his/her family into the loop to map out a care plan that goes beyond the hospital or clinic. It’s a tried-and-true process that’s gone beyond the pilot stage, as was noted in Healthcare IT News’ Monday morning breakfast panel and several educational sessions. Expect this to become the norm for patient care.”

 

Taken together, the growing emphasis on consumer engagement – coupled with the now-proven advantages of care coordination in overcoming the disconnect between physicians and other caregivers – is, in my opinion, highly likely to yield meaningful outcomes.

 

Equally important, as medical groups and health systems begin to make headway with consumer engagement while addressing care coordination holistically, providers should be able to work together to keep patients healthier – while remaining competitive in the marketplace.

 

What questions about mHealth do you have?

 

As a B2B journalist, John Farrell has covered healthcare IT since 1997 and is a sponsored correspondent for Intel Health & Life Sciences.

Read John’s other blog posts

Read more >

Creating a Cloud Perspective: Re-envisioning your Revolutionary Data Center

This is the second installment of a five part series on Tech & Finance.  Click here to read blog #1.

 

As I explained in my previous blog post, we are in the midst of a Third Industrial Revolution, which is driving huge changes in the way we live and work.


Enabled by the technological rise of computing, communications and the Internet, this latest disruptive period presents a range of challenges and opportunities for financial service organizations. Today, we’re going to delve into one of them: The Cloud.


Just as the rest of the world is riding a third wave of innovation, so is the data center.  First, we had an era of servers dedicated to a particular workload or a specific department.  This gave way to virtualized servers, which allowed infrastructure to be shared and hardware and software to be decoupled.  That, in turn, pushed us to what we see today: a major focus on rapid, agile and efficient service delivery through the cloud.


The impact of this revolution can clearly be felt outside the server room. Business leaders now want their information and services to be delivered as quickly as possible. Getting results in a timespan of months is no longer an option; it has to happen in minutes. Unfortunately, this is easier said than done. For example, I attended the SIBOS 2014 conference in Boston in September, where I learned that an estimated 85% of a bank’s resources are focused on running the bank, while only 15% goes toward changing and innovating the bank. Wouldn’t it be nice if that ratio were the other way around?


If you want to achieve that re-balancing act, you need to look at minimizing time, cost and labor.  Simultaneously, agility, automation and elasticity need to increase across the business. Cloud can help with this: whether in public, private or hybrid form, it can truly accelerate your digital transformation by shifting investment to where it’s needed most.


In the digital service economy, a key limiter for many financial institutions is their data center infrastructure. Years of having each business unit or product team manage its own sales, marketing, even P&L, can often result in highly fragmented IT environments and data repositories, not to mention duplicated effort for employees and a confusing experience for the customer. But cloud can enable digital transformation by removing legacy architecture and IT silos within the bank, in turn boosting operational efficiency, customer engagement, compliance and risk management.


I’m excited by the work that Intel is doing to meet the challenges of our customers in the financial service industry. Our strategy of re-envisioning the data center revolves around Software Defined Infrastructure (SDI). Essentially this means that users have control over a pool (or pools) of resources that may include server, network, and storage – all managed by orchestration software that uses whichever hardware is available. This approach is simplified, elastic, and built on open industry standards. We believe it will have a highly disruptive effect on the way enterprises think about defining, building, running and managing data centers.


The cloud is just one element of a successful revolutionary enterprise though. In my next post, I’ll be looking at how financial services should use analytics to pull deeper insights from their data resources.


This is the second installment of a five part series on Tech & Finance.  Click here to read blog #1.


Let’s continue the conversation on Twitter: @blalockm


Mike Blalock

Global Sales Director

Financial Services Industry, Intel

Read more >

How Intel QuickAssist Technology Helps Storage and Big Data Deal With Explosive Data Growth

I recently got the opportunity to discuss the security and network optimization applications of Intel QuickAssist Technology with Allyson Klein for her “Chip Chat” podcast. I enjoy listening to Chip Chat, so it was a great experience to be a part of the podcast.

 

The interview was part of our launch of the new Intel® Xeon® Processor E5-2600 v3 and Intel Communications Chipset 8900 featuring QuickAssist Technology.

 

QuickAssist Technology provides hardware-assisted compression and cryptography for Xeon-based platforms, allowing system manufacturers to implement real-time compression and encryption algorithms with minimal utilization of the CPU.  Thus, they get flexible, high-performance encryption and compression while preserving processing cores for revenue-generating applications.

 

On the podcast, I spoke a lot about the security benefits of this technology, ranging from better performance on the ciphers that protect data transmission in 3G and 4G/LTE networks to the use of secure sockets layer (SSL) encryption on a growing range of websites and web services. QuickAssist Technology provides a performance boost and increased efficiency for a wide range of these applications.

 

But the podcast lasts only 12 minutes, and what I didn’t get to discuss was the growing need for the compression capabilities of QuickAssist Technology in storage and big data analytics applications. We are living through a time of dramatic growth in data: IDC reports that the data created and copied each year will double annually, reaching 44 zettabytes (equivalent to 44 trillion gigabytes) by 2020.**  Where this hits particularly hard is with big data and storage applications, and QuickAssist Technology allows system manufacturers to implement real-time compression and encryption algorithms that can keep up with network and CPU performance.

 

Let’s look at how QuickAssist is being used in storage and big data applications:

 

Storage: Network performance and processor performance have increased dramatically in recent years, but hard disk drive (HDD) read/write performance has not kept pace. Thus, more data centers are tiering their storage systems to put their most active data on solid-state drives (SSDs) and less active data on HDDs, and using compression to get more space at every tier. Intel QuickAssist provides the computing power to compress data in real time, making it realistic to use compression even for the highest-performance storage tiers.

 

Big Data: 2015 is predicted by many to be a big year for big data, driven by the success of Hadoop, the leading big data framework. Many enterprises are running trials with Hadoop to garner useful analytics from their big data and will convert those trials into production systems in the coming year. Early big data projects focused on batch processing of data at rest, but now capabilities are evolving that enable processing and analyzing of streaming data in real-time. This evolution is unlocking a variety of exciting new use cases and applications in healthcare, telecom and capital markets.

 

Big data applications need compression for a couple of key reasons: Hadoop is designed to run on large data sets residing in compute and storage clusters, and those clusters rely on compression to preserve network bandwidth and storage capacity, which in turn optimizes network utilization and system cost. Real-time data compression delivers on these objectives while also improving the run time of end-to-end Hadoop processing. Intel QuickAssist Technology built into storage devices and Hadoop servers will help the next generation of full-scale deployments succeed.
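
To give a rough feel for the trade-off that hardware offload addresses, here is a small, purely illustrative Python sketch that measures software compression ratio and throughput with zlib. The payload is invented and the numbers depend entirely on the data and hardware; the point is simply that software compression consumes CPU cycles, which is exactly the work an accelerator like QuickAssist is meant to take off the cores.

    import time
    import zlib

    # Illustrative only: measure CPU-bound software compression on log-like data.
    payload = b"2015-01-08 12:00:00 INFO request served in 42 ms\n" * 200000  # ~10 MB

    start = time.perf_counter()
    compressed = zlib.compress(payload, 6)
    elapsed = time.perf_counter() - start

    ratio = len(payload) / len(compressed)
    throughput_mb_s = (len(payload) / 1000000) / elapsed
    print(f"ratio: {ratio:.1f}x, throughput: {throughput_mb_s:.0f} MB/s on one core")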

 

Compression and encryption are such fundamental processes for some of the key security and data growth challenges of our digital universe.  It’s exciting to see the new use cases for these processes that are enabled when QuickAssist Technology is used to drive new levels of performance.

 

 

What do you think? Find me on Twitter @jeni_p and let me know.

 

Read more >


Is Your Healthcare Security Friendly?

As recently as 10 years ago, healthcare IT was mostly corporate-provisioned and less diverse, and refresh rates were slower. Up to that point, usability was treated as a “nice to have” and a significantly lower priority than the features or functionality of solutions.

 

In that more homogeneous and slower-changing environment there was, for the most part, one way to get the job done. Fast forward to today, when most healthcare IT environments are far more heterogeneous, with a myriad of devices (both corporate-issued and personal BYOD, or bring your own device), operating systems, apps, versions, and social media, and with wearables and the Internet of Things now growing rapidly. Furthermore, refresh rates are much faster, especially for personal BYOD devices and apps. In today’s environment, usability is very much a “must have”: research shows that when it is absent, healthcare workers find workarounds, like using personal devices, and these workarounds drive non-compliance, add security and privacy risk, and can often be the source of breaches.

 

Traditionally we have approached usability and security as a tug of war or tradeoff … where having more security meant less usability and vice versa.

 

For more information on Healthcare Friendly Security see this new whitepaper.

 

Unfortunately, breaches have reached alarming levels in both business impact and likelihood. The total average cost of a data breach in 2014 was US $3.5 million; this average is global and spans several industries, including healthcare. Looking more specifically at healthcare, the global average cost of a data breach per patient record is US $359, the highest across all industries. With costs like these, avoiding breaches is of paramount importance for healthcare organizations. But how can we add security without compromising usability and inadvertently driving workarounds that actually cause non-compliance and risk?

 

What is desperately needed is security that preserves or even improves usability, where risks are significantly mitigated without driving healthcare workers to use workarounds. On the surface this may seem impossible, yet several security safeguards available today do just that. Many breaches occur due to loss or theft of mobile devices. A very good safeguard to help mitigate this risk is the self-encrypting SSD (solid-state drive). Take a conventional hard drive, unencrypted and at risk of causing a breach if lost or stolen, and replace it with a self-encrypting SSD: the result can often deliver better data access performance than the original unencrypted drive. Another example of a safeguard that improves both usability and security is MFA (multi-factor authentication) combined with SSO (single sign-on), which improves the usability and security of each login and also reduces the overall number of logins.

 

Intel Security Group is focused on creating innovative security safeguards that vertically integrate security software with security hardware, improving usability and hardening the overall solution to make it more resilient to increasingly sophisticated attacks, such as cybercrime. With cloud, mobile, and health information exchange, security becomes like a chain: effective security requires securing all points and avoiding weak links. Intel Security Group solutions span from mobile devices, through networks, to back-end servers. This paves the way for healthcare to adopt, embrace, and realize the benefits of new technologies while managing risk and improving usability.

 

What questions about healthcare IT security do you have?

 

For more information on Healthcare Friendly Security see this new whitepaper.

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel Corporation and a frequent blog contributor.


Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Read more >

IT’s Year in Review

New Year’s Day 2015


As I sat watching the Rose Bowl (not that I am a big follower of college football; I’m more of a pro football/college basketball kind of fan), I settled into the annual tradition of watching the Pac 10 battle the Big 10. Oh, wait, someone already screwed up THAT tradition. Oh, well, it’s not like IU ever played in it anyway. Anyway, as Oregon began to dominate (ok, right after the National Anthem), I thought it would be interesting to look back over my tweets of 2014 and see what insights into IT were captured by what I had shared. I was struck by the breadth of knowledge, experience, and subjects that had been covered. I could not think of a better place to share some of these posts than on Intel’s IT Peer Network.


A couple of things before we dive in. First, many (but certainly not all) of these posts came via the Top 100 CIOs to Follow on Twitter by Vala Afshar. If you are not following these folks and you are in IT, you should be! Second, to further set the stage, think back to the start of 2014 by checking out the projected trends for the year in Top Ten Strategic Technology Trends for 2014 from Gartner.


So without any further ado, and in no particular order…

Traditional vs. Digital CIOs: Survey Reveals a Growing Divide


IT’s Losing Battle Against Cloud Adoption


5 Lessons for CIOs in the Age of Cloud


Coaching for Corporate Creatives


The Cloud-fueled Disruption of Business Analytics


The Four Personas of the Next-Generation CIO


The Consumer Revolution of Enterprise Computing


How Gamification Drives Business Objectives


3 Ways Emerging Technologies are Changing the Way CIOs Do Business


What is Your Company’s Vision and Why Should You Care?


New IT Challenge: Balancing Agility and Discipline


IT has Finally Cracked the C-Suite

Can Big Data be the Next Big Economic Indicator?


CIOs Must Learn to Dance with other C-Suite Executives


IDC Chief Research Officer: CIOs Must Embrace The Third Platform


IT Spend is Growing but CIOs Just Don’t Get It


The Time to Think about Middlemen is Before there is Only One


The Strategic CIO: Are You and Your Team Ready


What is Mobile Business Intelligence?


2014 – The Year the CIO and CMO Partnership Became Mission Critical


That last post is from Isaac Sacolick; as I created my original list, I had over a half dozen posts from Isaac. If you follow one blog in 2015, you would do well to follow Isaac’s Social, Agile and Transformation… (ok, in addition to this one (“The CIO is Dead, Long Live the CIO”), Rivers of Thought, and my LinkedIn blog).


So, during that week between the Playoffs and the Super Bowl, when there is nothing else to do except watch the thermometer drop, why not spend some time clicking through these blog posts? Trust me, you will come away with more than one nugget for 2015!


Have a great 2015! Be sure to drop me a line with how things are going in your IT department with the Top Ten Strategic Technology Trends for 2015.


Jeff


Jeffrey Ton is the SVP of Corporate Connectivity and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.


Find him on LinkedIn.

Follow him on Twitter (@jtongici)

Add him to your circles on Google+

Check out more of his posts on Intel’s IT Peer Network

Read more from Jeff on Rivers of Thought

Read more >

Urban Growth and Sustainability: Building Smart Cities with the Internet of Things





This is the first installment of a four part series on Smart Cities with Dawn Olsen (#1 of 4).

Click here to read blog #2

Click here to read blog #3


Not long ago, the human race hit a significant milestone. In 2009, for the first time in our history, more of us lived in urban areas than in rural ones. It’s estimated that 54% of today’s global population lives in cities, and this figure is expected to rocket up to 66% by 2050. With this increase in city inhabitants, we’re quickly heading towards the “Megacity” era: soon a city with a population of 10 million or more will seem typical.  As these burgeoning metropolises drive industrial and financial growth on a global scale, powerful new economies are emerging and developing around the world.


Despite being financial powerhouses, cities can also generate their fair share of problems. For example, they consume two thirds of today’s available energy and other valuable resources, leaving the other third for the millions who still live in smaller settlements and rural areas. As urban populations get bigger, it is vital to make sure that our cities are ready to deal with more people, more traffic, more pollutants and more energy use in a scalable and sustainable way. In short, we need our cities to be smarter.


This is the challenge that gets me out of bed in the morning. I’m excited to be part of Intel’s smart cities initiative, which is focused on putting the Internet of Things (IoT) to use in any way that will benefit urban societies.



The IoT blocks that build smart cities may include anything from technical components like sensors that measure air quality or temperature, to end-to-end city management solutions that control traffic flow based on analysis of citywide congestion data. The combinations in which these blocks can be applied are almost limitless, and we are exploring innovative new applications to improve quality of life, cost efficiencies and environmental impact.  For example, the work Intel is undertaking with the City of San Jose, California, uses IoT technology to build more sustainable infrastructure, and the project has been recognized by the White House as part of its Smart America initiative. 



In this blog series (this is the first of four posts), I’ll be sharing my thoughts on some of the key areas in which we’re driving the smart cities of the future, based on innovative trials and deployments already completed, or going on now. My blog posts will cover three main areas:

  • Smart security and the evolving challenge of safeguarding our increasingly connected cities
  • Technology driving innovation in traffic and transport management
  • Sustainable solutions to the problem of rising air pollution.


Check back soon for my next post (next Thursday – 1/15/2015), which will explore how Intel’s smart city initiatives can help enhance citizens’ safety and security. I’ll give you a clue: it’s not by just rolling out more CCTV cameras.


Let’s get smart.


This is the first installment of a four part series on Smart Cities with Dawn Olsen (#1 of 4).

Click here to read blog #2

Click here to read blog #3


To continue the conversation, let’s connect on Twitter @DawnOlsen


Dawn Olsen

Global Sales Director

Government Enterprise, Intel

Read more >

Removing Barriers to Business in 2015: Smarter Devices and Services

We’ve all dealt with it before — how to make legacy technology work so that it doesn’t slow you down. Whether it’s losing a flash drive with a document on it that you spent hours writing, or being left out of the loop because a coworker forgot to hit “reply all” on an important email, or stepping in front of a group to give a presentation only to realize you’re not quite sure just how you’re going to connect your laptop to the screen — we’ve all experienced these workplace faux pas. But what if we didn’t have to deal with these hassles anymore?

 

As a work-from-home employee managing a social media team dispersed around the world, the ability to stay connected and collaborate with my team and customers is critical to our success. I look at the way we interact and wonder: What if the stodgy old productivity barriers of the traditional 9-to-5 office were swept away in a new era of organization, compatibility, and inspiration driven by smarter devices and services?

 

Mobile Puts the “M” in Movement

 

Big ideas can happen anywhere, anytime. Adopting mobile applications in business enables employees to capture and share in real time, making mobility an intuitive, simplified experience. Hands-free options like voice commands, gesture input, and smart alerts save time, streamline complex tasks, and make multitasking easier. Video conferencing has also had an enormous impact on establishing instant connectivity and personal interaction between remote, dispersed employees.



Per Michael DeFranco at Forbes, “In 2015, I expect to see a shift from enterprises investing in technologies that just end up giving workers another ‘thing to check’ to those that truly drive productivity by putting the right information into the right hands at the right time.”

 

Wireless Productivity

 

Wireless solutions are popping up everywhere with wireless display, wireless charging, and wireless docking becoming prevalent on both a corporate and consumer level. Wireless can be a game-changer when it comes to creating a connected boardroom, delivering presentations, leveraging data, and enabling more meaningful interactions.

 

Cloud-Enabled Collaboration

 

The future office will bring better use and integration of collaboration tools. Gone will be the days of losing precious work or sifting through an inundated inbox. “Long threads, attachments and elaborate formatting are archaic, confusing and counter to collaboration. Messaging services and apps trump email for all but the most formal or regulated communication, with no single service dominant, as context matters,” states Steven Sinofsky from Re/code.

 

Document security can also be improved by embracing the cloud. Web-based apps allow the business to control who can see and manipulate content. Sinofsky states, “Using cloud-based documents supports an organization knowing where the single, true copy resides, without concerns that the asset will proliferate. Mobile devices can use more secure viewers to see, print and annotate documents, without making copies unnecessarily.”

 

With the use of cloud-driven mobile applications, ease of access to information and services will be important. But so will keeping access to those assets safe and secure. The increased use of multi-factor authentication, including the potential mixture of biometric authentication and RFID, will provide convenience and ensure that only the right person sees the right information.

 

I find that I need to better manage myself and set boundaries between work and life as these new technology tools make collaboration anytime, anywhere easier. Yet I’m excited to know that the future of work might be closer than we think, and I welcome it with open arms in 2015.

 

To continue the conversation on Twitter, please follow me at @chris_p_intel or use #ITCenter.

Read more >

Data Center Design Tips


When you’re designing a new data center, it’s important to bear in mind that your needs will change as your company expands. Before you start, you should consider some basic issues that will enable you to retain some flexibility when it comes to fitting your current operational needs and letting them grow in the future.

 

Get the Size Right

Keep all your equipment in one location. Consider using enclosed cabinets or Telco racks to take maximum advantage of available space. Nowadays, thanks to the latest server technology, you can pack hundreds of servers into one rack. Choose a room with a ceiling height between 14 and 18 feet; this will ensure that your server room can be easily maintained and adequately cooled. Installing a raised floor will enable you to run cables and power whips from a central panel.

 

Keep it Secure 

As well as staying up to date on patches and service packs and locking down systems, don’t forget to ensure your physical security is top-notch. The last thing you want is someone walking into your server room and tampering with your hardware or removing it. Set a policy on who is permitted entry and use a secure keycard system.

 

Protect your System

Make sure that your data center is protected against fire, smoke and water by installing a monitoring system capable of alerting you should these issues arise. Also consider environmental factors that may increase the risk of physical threats to your server room, such as earthquakes, flooding and other forms of severe weather.
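At its core, that kind of monitoring is just a matter of comparing sensor readings against thresholds and raising alerts. The sensor names and limits in the sketch below are assumptions for the sake of the example; a production system would use the monitoring vendor's own agents and notification channels.

```python
# Sketch of an environmental monitoring check: read sensors, alert when any
# reading crosses its threshold (sensor names and limits are assumptions).

THRESHOLDS = {
    "temperature_c": 32.0,   # hot-aisle temperature ceiling
    "humidity_pct": 70.0,
    "smoke_ppm": 5.0,
    "water_leak": 0.5,       # leak-rope sensor reading
}

def check_readings(readings):
    """Return a list of alert messages for any out-of-range sensor."""
    alerts = []
    for sensor, limit in THRESHOLDS.items():
        value = readings.get(sensor)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {sensor}={value} exceeds {limit}")
    return alerts

if __name__ == "__main__":
    print(check_readings({"temperature_c": 35.2, "smoke_ppm": 0.1}))
```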

With some foresight and careful planning, your new data center can be designed so that it is very secure, and flexible enough to expand with your company’s growth.

 

Our team of experts at Data Center Resources is dedicated to providing cutting-edge services and products. We pride ourselves on our innovative solutions and our continued ability to surpass our customers’ expectations. Since 2002 we have been supplying our customers with individualized solutions for their data needs, and over the years we have formed lasting relationships with our clients. Meet our experts to find out how we can provide data solutions for you and your team.

Read more >

3 Innovations Coming to Mobile in 2015

There are a number of innovations in business mobile debuting from Intel and mobile device manufacturers in 2015. These will help enterprises to improve the user experience and productivity of their mobile work forces, and help IT bridge the gap between new user demands and the mobile equipment they offer.

 

  • Form factor innovation

The forthcoming low-power, high-performance Intel Core M processor will enable form factor innovations such as razor-thin tablets that offer the performance of a PC. Ultra-thin 2-in-1s will also be attractive to businesses, which will be able to offer smaller and better form factors to their mobile workers and business travellers. These form factors will bridge the gap between new user demands for business client devices and what the business can offer. The new super-thin tablets and notebooks will offer improvements in user experience, collaboration, productivity and portability.

  • No wires technology

The second innovation coming to mobiles is ‘no wires’ technology, which will simplify life for everyone. It will be a boon for mobile workers and IT departments, and continue the trend of workplace transformation. No wires docking, through Intel WiGig-based Wireless Docking technology, will transform meeting rooms. It gives workers ‘walk up’ convenience as it remembers their preferences, making them instantly productive. No wires docking can be used in combination with wire-free keyboards, mice, printers and monitors – such as the Intel Pro Wireless Display – and is poised to improve office working and modernise both consumer and business computing. Another exciting development coming in the future is ‘no wires charging’ for mobile devices. This will enable mobile workers to lay their device down on the desk so it can be charged automatically. This technology will accompany existing enterprise-class technology such as Intel vPro, which provides security and manageability in wire-free environments. Intel vPro can make wire-free meeting rooms secure, offer remote management, and provide features such as transferring screens from one room to another to enhance collaboration.

  • Stronger, simpler security

Thirdly, more advanced security – which is simpler both to use and administer – is coming to mobile devices. New security innovations will simplify the sign-on process, and reduce the number of passwords required.

 

Mobile devices will increasingly use multi-factor authentication, particularly biometric data such as face recognition, to improve business security. This ‘no passwords’ approach will simplify life for end users, who will be able to replace multiple passwords with a single biometric input.

 

As for IT departments, multi-factor authentication will harden the security of the platform, and will also simplify security management.

 

It will mean single sign-on for multiple cloud services through face recognition or two-factor authentication, and will utilise existing manageability and security features offered by Intel vPro technology.
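As a loose sketch of that ‘no passwords’ flow (and not a description of any actual Intel vPro or identity-provider API), the snippet below shows a device issuing a short-lived token after a local biometric check, which multiple cloud services could then accept; every name and lifetime here is an assumption for illustration.

```python
# Sketch of 'no passwords' single sign-on: after one local biometric check,
# the client obtains a short-lived token it can present to multiple cloud
# services. The biometric check and token store are placeholders.
import secrets
import time
from typing import Optional

TOKEN_LIFETIME_SECONDS = 15 * 60
_issued_tokens = {}  # token -> (user_id, expiry timestamp)

def biometric_check(user_id: str) -> bool:
    # Placeholder for an on-device face-recognition or fingerprint match.
    return True

def sign_on(user_id: str) -> Optional[str]:
    """Issue a short-lived SSO token only if the biometric check passes."""
    if not biometric_check(user_id):
        return None
    token = secrets.token_urlsafe(32)
    _issued_tokens[token] = (user_id, time.time() + TOKEN_LIFETIME_SECONDS)
    return token

def service_accepts(token: str) -> bool:
    """Any participating cloud service validates the same unexpired token."""
    record = _issued_tokens.get(token)
    return record is not None and record[1] > time.time()
```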

 

Mobile computing is about to enter a new phase of sophistication.

 

- Arif

Read more >

Keeping Pace with Technology Innovation in Financial Services

This is the first installment of a five-part series on Tech & Finance. Click here to read blog #2.


Are you a revolutionary?

 

When you think of the Industrial Revolution, I imagine your mind conjures images of 19th century railroads stretching to the horizon and factories belching out steam and smoke. So would it surprise you to know that we’re in the middle of an Industrial Revolution right now?

 

Every hundred years or so, we go through a period of massive and transformative disruption which affects the way we live and work. One of the earliest moments we might think of when we consider these transitional periods is the first Industrial Revolution. In the second Industrial Revolution, we saw the rise of electrification, the combustion engine, the automobile, and mass production. Now we’re witnessing what author Jeremy Rifkin has termed the Third Industrial Revolution, spurred by advances in computing, communications technology and the Internet.

 

Although these revolutions generate huge changes, they take time. Think about electricity. Ben Franklin proved the link between static electricity and lightning in the 1750s, but Edison didn’t invent the lightbulb until 1879.  And it wasn’t until the early 1920s that electricity became pervasive in households.

 

So we’re still fairly close to the beginning of this third revolution, but it’s clear that it’s already having a profound effect on the way we live and work. For starters, there’s the massive and unstoppable rise of what is called the Shared (or Collaborative) Economy. Imagine you’re heading to Singapore on a business trip (lucky you!). In this new world you’d never need to set foot in a hotel, taxi or office, opting instead to use AirBnB, Uber, oDesk or any number of similar services. The access they offer to shared resources through the Internet is creating a new paradigm where access trumps ownership. And it’s not just a fad: AirBnB now has a valuation of $13 billion and offers over 500,000 rooms worldwide, which puts it in the same marketplace as established hotel brands. Indeed, it’s set to be the second most valuable private company in Silicon Valley according to a recent Wall Street Journal article.

 

With this new model piling on the competitive pressure, established companies need to up their pace of change to stay relevant. Easier said than done, right? There may be any number of barriers to change, such as a lack of an ‘innovation’ culture, or restrictive hierarchies and processes. To be more innovative, you need to align HR, culture, IT and facilities, and have a financial model in place to drive workplace transformation. This is essential if you want to attract and retain the Generation Y employees who will form 75% of the workforce by 2025.

 

So what’s the underlying force driving both the Shared Economy and the demand for workplace transformation? It’s the SMAC stack. SMAC stands for Social, Mobile, Analytics and Cloud – the four technologies that are combining to drive business transformation and productivity. This stack simply didn’t exist five years ago the way that it does today, and it’s enabling innovation and scale at a phenomenal pace.

 

In my role at Intel, I work closely with our customers in the financial services industry, many of whom are experiencing the power of the SMAC stack first-hand. It’s transforming the landscape in which they operate but also helping banks address some of their key challenges, as recently outlined in a report by the banking software company Temenos.

 

These include:

  • Complying with regulations like Basel III, Dodd-Frank (in the US) and MiFID (in Europe) in the wake of the global financial crisis
  • The emergence of new competitors who may not have the same compliance obligations as traditional financial organizations, such as technology and information companies (Google, Apple, PayPal, Alibaba), retailers (Tesco, Walmart), or players from the fringes of banking (AMEX, Simple, GoBank, Ally).
  • Changing customer behaviour, driven by mobility and the ubiquity of information access. Banks must merge their channels across digital banking and transform their branches to create an efficient, secure omni-channel experience for their customers.

 

The bottom line is that banks must comply with regulations and keep the system safe while innovating like a start-up. In this blog series, I’ll be exploring four business imperatives that financial organizations need to embrace in order to turn their technology assets into a powerful differentiator and driver of innovation – and how this will help them to succeed as revolutionaries in the third industrial age.


This is the first installment of a five-part series on Tech & Finance. Click here to read blog #2.


Let’s continue the conversation on Twitter: @blalockm


Mike Blalock

Global Sales Director

Financial Services Industry, Intel

Read more >