Recent Blog Posts

The Data Stack – October 2014 Intel® Chip Chat Podcast Round-up

In October we continued to archive livecast episodes from the Intel Developer Forum, covering robotics and the progress toward artificial intelligence, software-defined infrastructure and intelligent data centers, and emerging technologies in the cloud computing industry. If you have a topic you’d like to see covered in an upcoming podcast, feel free to leave a comment on this post!

 

Intel® Chip Chat:

  • Meet Jimmy, the Robot Intel Employee – Intel® Chip Chat episode 347: In this archive of a livecast from IDF, Intel Futurist Brian David Johnson stops by with a special guest, Matt Trossen, the CEO of Trossen Robotics. We’re talking to them about the 21st Century Robot Project, which uses open source hardware and software and 3D printing to make customizable robots driven by apps and connected to other devices for an ecosystem of computation. Trossen Robotics makes the Intel® Edison-powered humanoid exoskeleton, and a 3D printer produces the robot’s skin, giving makers a platform to start from when innovating with robotics. You can order a robot development kit (and check out the book) at www.21stcenturyrobot.com and learn more at www.trossenrobotics.com.
  • Intelligent Infrastructure for the Digital Services Economy – Intel® Chip Chat episode 348: In this archive of a livecast from the Intel Developer Forum, Johnathan Donaldson (@jdonalds), the GM of Software Defined Infrastructure in the Cloud Platforms Group at Intel, stops by to talk about building intelligent infrastructure for enhanced platform and capabilities awareness, as well as dynamic workload placement and configuration. In a digital services economy, intelligent infrastructure is critical for development cycles and time to market. For more information, visit www.intel.com and search for software defined infrastructure.
  • Building Data Centers with Intelligence – Intel® Chip Chat episode 349: In this archive of a livecast from the Intel Developer Forum, we’ve got three great interviewees discussing the intelligent data center. Das Kamhout, a Cloud Orchestration Architect and Principal Engineer at Intel; Scott Carlson, an Information Security Architect for PayPal; and John Wilkes, a Principal Software Engineer at Google are on hand to talk about building, scaling, securing, and adding an intelligent software layer to modern data centers. For PayPal, a top priority is how to protect the money flow and retain customer trust. Google focuses on building smart systems that can scale massively and offer high reliability. For more information, visit: intel.ly/orchestration.
  • Next-gen Computing for Enterprises – Intel® Chip Chat episode 350: In this archive of a livecast from the Intel Developer Forum, Paul Miller (@PaulMiller), the founder of Cloud of Data, chats about various hot topics in enterprise computing, including orchestration and telemetry for on-demand, cost-effective workload distribution; personalized medicine and data protection/privacy; and the evolution of public and private clouds and the emergence of containers. For more information, visit www.cloudofdata.com.

Read more >

Moving to the Cloud – Should the CIO Focus More on Systems of Engagements?

In a short video, Geoffrey Moore describes the evolution of focus from Systems of Record to Systems of Engagement. At the end, he highlights the fact that systems of engagement probably require a very different type of IT than systems of record.

 

Systems of record host the key processes and data elements of the enterprise. Most often they have been implemented prior to the year 2000 to ensure enterprise survival in the new millennium. Great efforts and vast amounts of money went into implementing these systems and adapting both the enterprise and the software to one another. Since then, these systems have continued to run reliably and support enterprise operations.

 

But a couple of things happened.

 

Users, accustomed to having information at their fingertips through smartphones and other mobile devices, are now asking for access to the systems of record. New interaction mechanisms, such as social media, offer new sources of information that provide a better understanding of market needs, customer demand and the overall environment in which the enterprise operates. The world is increasingly becoming digital. The boundaries between business and IT are shrinking as every business interaction these days involves the use of information technology in one way or another.

 

In parallel, time has been shrinking. What we expected, 10 or 15 years ago, to take several hours or days can now be done in a matter of minutes due to rapid advancements in IT. Hence the new style of IT, as Meg Whitman calls it, is now required to respond to user needs. Cloud is definitely part of this transformation in IT, as it provides enterprises with the responsiveness and agility required to address today’s ever-changing business environment.

 

As enterprises decide to move to the cloud, the question – where to start? – typically comes up. In a blog entry I published over one year ago, I spoke about five use cases companies could envisage to start their cloud journey. It ultimately depends on the decision to consume services from a cloud environment (be it private, managed or public). So the question of which application should be moved to the cloud first is raised. Should we start with a system of record or a system of engagement?


Should we start with Systems of Record?

As stated by Geoffrey Moore, most Systems of Record have been in place for the last 10 to 15 years. They are transaction based and focus on facts, dates and commitments. They often represent a single source of the truth. Indeed, they typically have been built around a single database, containing mostly structured information. Every time information is changed, the event is logged, so one can quickly find out who did what. Data is kept in the systems for extensive periods of time to ensure compliance while access is regulated and contained. They are the core support systems for the operations of the enterprise.

 

A couple of companies focused on the development of such systems – the most well-known being SAP and Oracle for their financial and manufacturing systems. Other enterprises may have written their own applications and are left with a small team of knowledgeable resources to maintain them. These systems are considered business critical, as the company can no longer operate without them. They contain the “single version of the truth”. And I can speak from my own experience: even if you disagree with those numbers (for example, if deals have been mis-categorized), you will have great difficulty convincing higher levels of management that the data is incorrect.

 

Some enterprises may require increased flexibility in the use of such systems; they may want increased agility and responsiveness in case of a merger or divestiture. But are these the systems we should migrate to cloud first?

 

What would be the benefit? Well, we could probably run them more cheaply; we may be able to give our users the additional levels of responsiveness, agility and flexibility they are looking for. But on the other hand, we would have to modernize an environment that runs well and supports the business of the enterprise on a daily basis. Or we could rebuild a brand new system of record based on the latest version of the software. This may be the only option, but to me… that sounds risky.

 

I’ve seen a couple of companies doing this, but it was mostly in the case of a merger, divestiture, consolidation of systems or a move to a new data center or IT delivery mechanism. And in most of these cases, it turned out that capabilities available in the cloud – automation, flex-up/flex-down and service – were not fully taken into consideration during the installation of the application.

 

Now, employees might want access to the systems of record through their mobile devices. They might want a friendlier user interface. They might want to combine functions that are separate in the original system. This is a whole different ballgame.

 

Using web services, we could encapsulate the system of record and give users what they want without disrupting the original environment. Over time we could consider updating or transforming some of the functionality, shutting it down in the original package and replacing it with cloud-based functionality. This reduces the risk and shields the end user from the actual package, making it easier to transform the system of record without overwhelming disruption.
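To make that encapsulation concrete, here is a minimal sketch, in Python with Flask, of what such a web-service facade could look like. The endpoints, field names and the in-memory dictionary standing in for the legacy package are all invented for illustration; a real facade would call the existing package’s own interfaces behind these routes.

```python
# Minimal sketch of a web-service facade over a legacy system of record.
# The ORDERS dict stands in for the legacy package; endpoint and field
# names are hypothetical, not any real product's API.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for data held by the legacy system of record.
ORDERS = {"1001": {"status": "open", "total": 250.0}}

@app.route("/api/orders/<order_id>", methods=["GET"])
def get_order(order_id):
    # Reads go through the facade; callers never see the legacy schema.
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify({"id": order_id, **order})

@app.route("/api/orders/<order_id>/status", methods=["PUT"])
def update_status(order_id):
    # Writes are funneled through one place, so the legacy package's own
    # validation and logging can run before anything changes.
    if order_id not in ORDERS:
        return jsonify({"error": "not found"}), 404
    new_status = request.get_json()["status"]
    ORDERS[order_id]["status"] = new_status
    return jsonify({"id": order_id, "status": new_status})

if __name__ == "__main__":
    app.run(port=8080)
```

Mobile apps or a new cloud front end call these routes; the package behind them can later be replaced piece by piece without the callers noticing.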


What is different with Systems of Engagement?

Systems of engagement were developed some time after systems of record – so they involve newer technologies. In particular, many of them are built around SOA principles, making them more suitable to take full advantage of cloud technology. Their objectives are interaction and collaboration. It’s all about sharing insights, ideas and nuances. They are used within the frame of business opportunities and projects, making the relationships transient in nature while requiring the responsiveness and agility to be set up quickly. Access is ad hoc and in many companies may require partner-to-customer interaction. Most often, information is unstructured, which makes search more difficult.

 

Obviously, Systems of Engagement are important, but they do not maintain the critical information needed to run a company. They are important as a mechanism to share information, gain consensus and make decisions. However, they do not maintain the single source of the truth. That makes them more suitable for experimentation. Their nature, their needs and the technologies used to build them make them better candidates for migration to the cloud. So, I would suggest that this is where we should start. Of course, we don’t want our end users to be left in the cold if something happens during the migration. But even in the worst-case scenario, telephones can still be used to exchange information if the system is down for some time.


The importance of data

Tony Byrne argues that Geoffrey Moore simplifies things by creating two clearly different categories. He points out that the issue is probably messier in real life. On the one hand, people are discussing important business decisions in collaboration systems – thereby creating records – while others may want to engage with their colleagues directly from the systems of record. Byrne explains it in simple terms: “your colleagues are creating records while they engage, and seeking to engage while they manage formal documents and participate in structured processes. Ditto for your interactions with customers and other partners beyond your firewall.”

 

Now, we have been able to trigger functionality from within applications for quite some time.  That’s not the issue. And the use of web services described earlier makes this reasonably easy to implement.

 

The focus of Tony’s discussion is how data can be moved between the systems of record and the systems of engagement. Right from the start, you should think about your data sources and information management. Again, technology exists today to access data within and outside a cloud environment. What’s important is to figure out what data should be used when and where, while ensuring that it is properly managed along the way. If you access and change data in a system of record, do it in such a way that all the checking, security and logging functionality is respected. But this should be nothing new. Companies have been integrating external functionality within their systems of record for years.
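As a rough sketch of what respecting that checking, security and logging functionality can look like in code (all names, fields and rules below are invented for illustration, not any particular product’s API), a guarded write path might validate each change and record who did what:

```python
# Sketch of a guarded write path for a system of record: field-level
# validation and audit logging wrap every change. Names and rules are
# invented examples.
from datetime import datetime, timezone

AUDIT_LOG = []
ALLOWED_FIELDS = {"status", "owner"}   # fields a caller may change

def update_record(records, user, record_id, changes):
    # Validate before touching the data: reject protected fields.
    protected = set(changes) - ALLOWED_FIELDS
    if protected:
        raise ValueError(f"attempt to change protected fields: {sorted(protected)}")
    records[record_id].update(changes)
    # Log the event so "who did what" can always be answered.
    AUDIT_LOG.append({
        "who": user,
        "what": sorted(changes),
        "record": record_id,
        "when": datetime.now(timezone.utc).isoformat(),
    })

# Example: a system-of-engagement component updating a record it may touch.
records = {"1001": {"status": "open", "owner": "alice", "total": 250.0}}
update_record(records, "bob", "1001", {"status": "closed"})
```

Exposing the same wrapper to the systems of engagement means that however the data moves, the record system’s own rules travel with it.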


Conclusion

When companies look at migrating to the cloud, the question of where to begin is often debated. In my mind, it’s important to show end users the benefits of the cloud early on. That leads me to lean more toward starting with systems of engagement, either transforming existing ones or building new ones that will positively surprise users. This will get their buy-in and give IT more “cloud” to transform the remainder of the IT environment. The real question is: how far do you need to go? Because not everything has to be in the cloud. At the end of the day, you should only move what makes sense.

Read more >

Game Over! Gamification and the CIO

Congratulations to the winner of the CIO Superhero Game!


The first to complete the challenge and become “Super CIO” was Brad Ton of Reindeer Auto Relocation. He was presented with an AWESOME Trophy to display proudly in his office. To win, Mr. Ton had to defeat five evil henchmen and the arch enemy of CIOs everywhere, Complacent IT Guy.


So what is this craziness about CIO Superheroes? Just my dorky way of introducing another one of the challenges impacting the CIO today…Gamification.


Gamifi-what?


Gamification: the use of game thinking and game mechanics in non-game contexts to engage people (players) in solving problems.


According to Bunchball, one of the leading firms in applying gamification to business processes, gamification is made up of two major components. The first is game mechanics – things like points, leaderboards, challenges and levels – the pieces that make game playing fun, engaging and challenging; in other words, the elements that create competition. The second component is game dynamics – things like rewards, achievement and a sense of competition.


Games are everywhere. People love to compete and people love to play games. What does this have to do with the role of CIO?


Everything!


Want to improve customer engagement? Make it a game! Want employees to embrace a new process? Make it a game! Want to improve performance? Make it a game! Add game mechanics and game dynamics into the next app you are building, layer them on an existing application, and put them in the next process improvement initiative.


Even sites like the Intel IT Peer Network use game theory to increase engagement. You earn points for all kinds of activity, which might include logging in multiple times, using search to find content, or posting a blog. I find it interesting that while these points earn you badges and levels, they actually offer minimal intrinsic value. Nevertheless, I found myself disappointed during a recent systems upgrade to have my points reset to zero. Alas, I am an Acolyte again!
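As a rough illustration of how those mechanics hang together – the point values, level thresholds and badge rules below are invented, not how this or any real site is actually built – the core logic behind points, levels, badges and a leaderboard can be surprisingly small:

```python
# Toy model of gamification mechanics: points per activity, with levels and
# badges derived from the running total. Values and thresholds are invented.
POINTS = {"login": 1, "search": 2, "post_blog": 10}
LEVELS = [(0, "Acolyte"), (50, "Contributor"), (200, "Expert")]

class Player:
    def __init__(self, name):
        self.name = name
        self.points = 0
        self.badges = set()

    def record_activity(self, activity):
        # Game mechanics: every tracked action adds to the score.
        self.points += POINTS.get(activity, 0)
        if activity == "post_blog":
            self.badges.add("Author")

    @property
    def level(self):
        # Highest level whose threshold the running total has reached.
        current = LEVELS[0][1]
        for threshold, title in LEVELS:
            if self.points >= threshold:
                current = title
        return current

def leaderboard(players):
    # Game dynamics: ranking players creates the sense of competition.
    return sorted(players, key=lambda p: p.points, reverse=True)

jeff = Player("jeff")
for activity in ["login", "search", "post_blog"]:
    jeff.record_activity(activity)
print(jeff.points, jeff.level, sorted(jeff.badges))  # 13 Acolyte ['Author']
```

The same pattern can be layered onto an existing application: instrument the actions you care about, award points, and surface the leaderboard.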


Now, back to the CIO Superhero Game.


Recently, I had the opportunity to interview our Superhero CIO winner. Here are just a few of his thoughts surrounding gamification.


What caught your attention enough to want to play the game?


“I really thought that Twitter was a very unique way to play a game.  It was not something I had ever done before.  I’m a frequent reader of all of your writings, so I knew I was in for a learning experience if nothing else.  I’m a Twitter addict, so I felt comfortable diving into the CIO world even as someone not extremely knowledgeable on the topic.  Frankly, I’d rather have an hour long dentist appointment than read an instruction manual.  This was easily accessible – right at my fingertips and very self-explanatory.”


Games can engage, games can inspire, games can teach. What is the most important lesson you learned from playing the game?


“While I never truly understood the intricacies of being a CIO, I always appreciated the hard work and dedication it took to get to such a prestigious level.  After going through the CIO Superhero game, I can honestly say that I now genuinely respect it.  The passion behind the game was something I enjoyed.  It wasn’t a bland exercise built with little thought or substance.  I could feel that the game was designed to teach and help grow others into not only understanding new topics previously unknown – but to inspire them into being pro-active in sharing & creating their own ideas.   That is when you know you have something special.  More than the topics themselves, the passion behind what the game was meant to do is what was really able to draw me in.”   


You are not a CIO yourself. Do you think gamification of a process would work in your business, and if so, can you give an example?


“Any tool that can supply a different approach to creating a better understanding of a current process is always worth the attempt.  I also think the concept of Gamification is able to provide a different perspective, which can spark new ways to think about old processes. Implementing gamification could highlight the variables within our industry that can, in turn, allow for a more personable approach.  Cost, scheduling, bookings…logistics are important, but the game tailored to our industry could be much more personal and deal directly with relationships of all parties involved in a relocation. Whereas, a typical goal would be to complete an on-time relocation with small out-of-pocket costs, the game’s primary objective would be to receive positive feedback from customers, clients, etc.   Yes, on-time and small cost could equate to this outcome, but not always.  “Defeat the evil henchmen” by coming up with a new idea to improve customer service, for instance.  By defining the game’s objectives from a relationship standpoint, you can spark new and creative ways of thinking.”


So, there you have it.


Gamification – just another element within the myriad of changes impacting the CIO today. It truly is a “game” changer that can increase adoption and engagement across a variety of businesses and processes.


This is a continuation of a series of posts titled “The CIO is Dead! Long Live the CIO!” looking at the confluence of changes impacting the CIO and IT leadership. #CIOisDead. Next up: “Faster than a speeding bullet – The Speed of Change”.

Jeffrey Ton is the SVP of Corporate Connectivity and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.


Find him on LinkedIn.

Follow him on Twitter (@jtongici)

Add him to your circles on Google+

Check out his posts on Intel’s IT Peer Network

Read more from Jeff on Rivers of Thought

Read more >

Unleashing the Digital Services Economy

Today Intel delivered a keynote address to 1,000+ attendees at the Open Compute Project European Summit in Paris. The keynote, delivered by Intel GM Billy Cox, covered Intel’s strategy to accelerate the digital services economy by delivering disruptive technology innovation founded on industry standards. The foundation of Intel’s strategy is an expansion of silicon innovation, augmenting its traditional Xeon, Xeon Phi and Atom solutions with new standard SKUs and custom solutions based on specific workload requirements. Intel is expanding its data center SoC product line with the planned introduction of a Xeon-based SoC in early 2015, which is sampling now. This will be Intel’s 3rd generation 64-bit SoC solution.

 

To further highlight this disruptive innovation, Cox described how Intel is working closely with industry leaders Facebook and Microsoft on separate collaborative engineering efforts to deliver innovative and more efficient solutions for the data center. Cox detailed how Intel and Facebook engineers worked together on Facebook’s delivery of the new Honey Badger storage server for their photo storage tier featuring the Intel® Atom™ processor C2000, a 64-bit system-on-chip. The high capacity, high density storage server offers up to 180TB in a 2U form factor and is expected to be deployed in 1H’15.  Cox also detailed how Microsoft has completed the 2nd generation Open Cloud Server (OCSv2) specification. Intel and Microsoft have jointly developed a board to go into OCSv2 that features a dual-processor design, built on the Intel Xeon E5-2600 v3 series processor that enables 28 cores of compute power per blade.

 

Collaboration with Open Compute reflects Intel’s decades-long history of working with industry organizations to accelerate computing innovation. As one of the five founding board members of the Open Compute Project, we are deeply committed to enabling broad industry innovation by openly sharing specifications and best practices for high efficiency data center infrastructure. Intel is involved in many OCP working group initiatives spanning rack, compute, storage, network, C&I and management, which are strategically aligned with our vision of accelerating rack scale optimization for cloud computing.

 

At the summit, Intel and industry partners are demonstrating production hardware based on our Open Compute specifications. We look forward to working with the community to help push datacenter innovation forward.

Read more >

Adopting & Enabling OpenStack in the Enterprise: A look at OpenStack Summit 2014

As I discuss the path to cloud with customers, one topic that is likely to come up is OpenStack. It’s easy to understand the inherent value in OpenStack as an open source orchestration solution, but this value is balanced by ever-present questions about OpenStack’s readiness for the complex environments found in telco and enterprise. Will OpenStack emerge as a leading presence in these environments, and in what timeframe? What have lead adopters experienced with early implementations and POCs? Are there pitfalls to avoid, and how can we use these learnings to drive the next wave of adoption?

 

This was most recently a theme at the Intel Developer Forum, where I caught up with Intel’s Jonathan Donaldson and Das Kamhout on Intel’s strategy for orchestration and its effort to apply key learnings from the world’s most sophisticated data centers to broad implementations. However, Intel is certainly not new to the OpenStack arena, having been involved in the community from its earliest days and, more recently, having delivered Service Assurance Administrator, a key tool that gives OpenStack environments better insight into underlying infrastructure attributes. Intel has even helped lead the charge of enterprise implementation with the integration of OpenStack into Intel’s own internal cloud environment.

 

These lingering questions on broad enterprise and telco adoption, however, make the upcoming OpenStack Summit a must-attend event for me this month. With the event loaded with discussions from leading enterprise and telco experts from companies like BMW, Telefonica, and Workday on their experiences with OpenStack, I’m expecting to get much closer to the art of the possible in OpenStack deployment, as well as learn more about how OpenStack providers are progressing with enterprise-friendly offerings. If you’re attending the Summit, please be sure to check out Intel’s lineup of sessions and technology demonstrations, and connect with Intel executives on site to discuss our engagements in the OpenStack community and our work with partners and end customers to help drive broad use of OpenStack in enterprise and telco environments. If you don’t have the Summit in your travel plans, never fear. Intel will help bring the conference to you! I’ll be hosting two days of livecast interviews from the floor of the Summit. We’ll also be publishing a daily recap of the event on the Data Stack with video highlights, the best comments from the Twitterverse, and much more. Please send input on the topics that you want to hear about from OpenStack to ensure that our updates match the topics you care about. #OpenStack

Read more >

Going Green With Your Data Center Strategy

For an enterprise attempting to maximize energy efficiency, the data center has long been one of the greatest sticking points. A growing emphasis on cloud and mobile means growing data centers, and by nature, they demand a gargantuan level of energy in order to function. And according to a recent survey on global electricity usage, data centers are sucking more energy than ever before.

 

George Leopold, senior editor at EnterpriseTech, recently dissected Mark P. Mills’ study entitled, “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, And Big Power.” The important grain of salt surrounding the survey is that funding stemmed from the National Mining Association and the American Coalition for Clean Coal Electricity, but there were some stark statistics that shouldn’t be dismissed lightly.

 

“The average data center in the U.S., for example, is now well past 12 years old — geriatric class tech by ICT standards. Unlike other industrial-classes of electric demand, newer data facilities see higher, not lower, power densities. A single refrigerator-sized rack of servers in a data center already requires more power than an entire home, with the average power per rack rising 40% in the past five years to over 5 kW, and the latest state-of-the-art systems hitting 26 kW per rack on track to doubling.”

 

More Power With Less Energy

 

As Leopold points out in his article, providers are developing solutions to circumvent growing demand while still cutting carbon footprint. IT leaders can rethink energy usage by concentrating on air distribution and attempting assorted cooling methods. This ranges from containment cooling to hot huts (a method pioneered by Google). And thorium-based nuclear reactors are gaining traction in China, but don’t necessarily solve waste issues.

 

If the average data center in the U.S. is more than 12 years old, IT leaders need to start looking at the tech powering their data centers and rethink the demand on the horizon. Perhaps the best way to go about this is to think about the foundation of the data center at hand.

 

Analysis From the Ground Up

 

Intel IT has three primary areas of concern when choosing a new data center site: environmental conditions, fiber and communications infrastructure, and power infrastructure. These three criteria bear the greatest weight on the eventual success — or failure — of a data center. So when you think about your data center site in the context of the given criteria, ask yourself: Was the initial strategy wise? How does the threat proximity compare to the resource proximity? What does the surrounding infrastructure look like and how does that affect the data center? If you could go the greenfield route and build an entirely new site, what would you retain and what would you change?

 

Every data center manager in every enterprise has likely considered the almost counterintuitive concept that more power can come with less energy. But doing more with less has been the mantra since the beginning of IT. It’s a challenge inherent to the profession. Here at Intel, we’ll continue to provide invaluable resources to managers looking to get the most out of their data center.

 

To continue the conversation, please follow us at @IntelITCenter or use #ITCenter.

Read more >

5 Questions for Howard A. Zucker, MD, JD

Health IT is a hot topic in the Empire State. New York was the first state to host an open health data site and is now in the process of building the Statewide Health Information Network of New York (SHIN-NY), which will enable providers to access patient records from anywhere in the state.

 

To learn more, we caught up with Howard A. Zucker, MD, JD, who was 22 when he got his MD from George Washington University School of Medicine and became one of America’s youngest doctors. Today, Zucker is the Acting Commissioner of Health for New York State, a post he assumed in May 2014. Like his predecessor Nirav R. Shah, MD, MPH, Zucker is a technology enthusiast, who sees EHRs, mobile apps and telehealth as key components to improving our health care system. Here, he shares his thoughts.

 

What’s your vision for patient care in New York in the next five years?

 

Zucker: Patient care will be a more seamless experience for many reasons. Technology will allow for further connectivity. Patients will have access to their health information through patient portals. Providers will share information on the SHIN-NY. All of this will make patient care more fluid, so that no matter where you go – a hospital, your doctor’s office or the local pharmacy – providers will be able to know your health history and deliver better quality, more individualized care. And we will do this while safeguarding patient privacy.

 

I also see a larger proportion of patient care taking place in the home. Doctors will take advantage of technologies like Skype and telemedicine to deliver that care. This will happen as patients take more ownership of their health. Devices like Fitbit amass data about health and help people take steps to improve it. It’s a technology still in its infancy, but it’s going to play a major role in long-term care.

 

How will technology shape health care in New York and beyond?

 

Zucker: Technology in health and medicine is rapidly expanding – it’s already started. Genomics and proteomics will one day lead to customized medicine and treatments tailored to the individual. Mobile technology will provide patient data to change behaviors. Patients and doctors alike will use this type of technology. As a result, patients will truly begin to “own” their health.

 

Personally, I’d like to see greater use of technology for long-term care. Many people I know are dealing with aging parents and scrambling to figure out what to do. I think technology will enable more people to age in place in ways that have yet to unfold.

 

What hurdles do you see in New York and how can you get around those?

 

Zucker: Interoperability remains an ongoing concern. If computers can’t talk to each other, then this seamless experience will be extremely challenging.

 

We also need doctors to embrace and adopt EHRs. Many of them are still using paper records. But it’s challenging to set up an EHR when you have patients waiting to be seen and so many other clinical care obligations. Somehow, we need to find a way to make the adoption and implementation process less burdensome. Financial incentives alone won’t work.

 

How will mobility play into providing better patient care in New York?

 

Zucker: The human body is constantly giving us information, but only recently have we begun to figure out ways to receive that data using mobile technology. Once we’ve mastered this, we’re going to significantly improve patient care.

 

We already have technology that collects data from phones, and we have sensors that monitor heart rate, activity levels and sleep patterns. More advanced tools will track blood glucose levels, blood oxygen and stress levels.

 

How will New York use all this patient-generated health data?

 

Zucker: We have numerous plans for all this data, but the most important will be using it to better prevent, diagnose and treat disease. Someday soon, the data will help us find early biomarkers of disease, so that we can predict illness well in advance of the onset of symptoms. We will be able to use the data to make more informed decisions on patient care.

Read more >

Delivering on Choice for Hybrid Cloud Customers with Intel & EMC

Stu Goldstein is a Market Development Manager in the Communications and Storage Infrastructure Group at Intel.

 

When I purchased new laptops for my sons as they went off to college, a big part of the brand decision revolved around support and my peace of mind. Sure enough, one of my sons blew his motherboard when he plugged into an outlet while spending a summer working in China, and the other trashed his display when playing jump rope with his power cord, pulling his PC off the bed. In both cases I came away feeling good about the support received from the brand that I trusted.

 

So, knowing a bit more about how I think, it should not be a big surprise that I see the EMC Hybrid Cloud announcement today as important. Enterprises moving to converged, software-defined storage infrastructures should have choices. EMC is offering the enterprise the opportunity to evolve without abandoning a successfully engineered infrastructure, including the support that will inevitably be needed. The creation of products that maximize existing investments, while providing the necessary path to a secure hybrid cloud, is proof of EMC’s commitment to choice. Providing agility moving forward without short-circuiting security and governance can be difficult; EMC’s announcement today recognizes the challenge. Offering a VMware edition is not surprising; neither is the good news about supporting a Microsoft edition. However, a commitment to “Fully Engineered OpenStack Solutions” is a big deal. Intel is a big contributor to open source, including OpenStack, so it is great to see this focus from EMC.

 

EMC has proven over the last several years that it can apply much of the underlying technology that Intel® Xeon® processors combined with Intel® Ethernet Converged Network Adapters have to offer. When Intel provided solutions that increased memory bandwidth by 60% and doubled I/O bandwidth generation over generation, EMC immediately asked, “What’s next?” Using these performance features coupled with Intel virtualization advances, the VMAX³ and VNX solutions prove EMC is capable of moving Any Data, Anytime, Anywhere while maintaining VMs that are isolated to allow for secure shared tenancy. Now EMC is intent on proving it is serious about expanding the meaning of Anywhere. (BTW, the XtremIO Scale Out products are a great example of Anytime, using Intel Architecture advancements to maintain consistent 99th-percentile latency of less than 1 ms and provide the steady performance customers need to take the most advantage possible of this all-flash array.) EMC is in a unique position to offer customers of its enterprise products the ability to extend benefits derived from highly optimized deduplication, compression, flash, memory, I/O and virtualization technology into the public cloud.

 

Getting back to support – a broad term that comes into laser focus when you need it. It has to come from a trusted source, no matter whether your storage scales up or out, is open, sort of open or proprietary. It costs something whether you rely on open source distros, OEMs or smart people hired to build and support homegrown solutions. EMC’s Hybrid Cloud announcement is a recognition that adding IaaS needs backing that covers you inside and out – or, said another way, from the inside into the outside. I look forward to seeing what IT managers do with EMC’s choices and the innovation this initiative brings to the cloud.

Read more >