Recent Blog Posts

Game Over! Gamification and the CIO

Congratulations to the winner of the CIO Superhero Game!


The first to complete the challenge and become “Super CIO” was Brad Ton of Reindeer Auto Relocation. He was presented with an AWESOME trophy to display proudly in his office. To win, Mr. Ton had to defeat five evil henchmen and the arch-enemy of CIOs everywhere, Complacent IT Guy.


So what is this craziness about CIO Superheroes? Just my dorky way of introducing another one of the challenges impacting the CIO today…Gamification.


Gamifi-what?


Gamification is the use of game thinking and game mechanics in non-game contexts to engage people (players) in solving problems.


According to Bunchball, one of the leading firms in applying gamification to business processes, gamification is made up of two major components. The first is game mechanics – things like points, leaderboards, challenges, and levels – the pieces that make game playing fun, engaging, and challenging. In other words, the elements that create competition. The second is game dynamics – things like rewards, achievement, and a sense of competition – the motivations that keep players engaged.
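
To make the two components concrete, here is a minimal sketch in Python of how points, levels, and a leaderboard might be wired together. The names, point values, and level titles are invented for illustration; they are not taken from Bunchball or any real platform.

```python
from collections import defaultdict

# Hypothetical level thresholds; "Super CIO" nods to the game in this post.
LEVELS = [(0, "Novice"), (100, "Apprentice"), (500, "Hero"), (1000, "Super CIO")]

class Game:
    """Toy gamification layer: mechanics (points, levels) plus a leaderboard."""

    def __init__(self):
        self.points = defaultdict(int)

    def award(self, player, amount):
        # Game mechanic: completing a challenge earns points.
        self.points[player] += amount

    def level(self, player):
        # Game mechanic: accumulated points map to a named level.
        name = LEVELS[0][1]
        for threshold, title in LEVELS:
            if self.points[player] >= threshold:
                name = title
        return name

    def leaderboard(self):
        # Game dynamic: a public ranking creates the sense of competition.
        return sorted(self.points.items(), key=lambda kv: kv[1], reverse=True)

game = Game()
game.award("brad", 1200)    # defeated five henchmen and Complacent IT Guy
game.award("jeff", 300)
print(game.leaderboard())   # [('brad', 1200), ('jeff', 300)]
print(game.level("brad"))   # Super CIO
```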


Games are everywhere. People love to compete and people love to play games. What does this have to do with the role of CIO?


Everything!


Want to improve customer engagement? Make it a game! Want employees to embrace a new process? Make it a game! Want to improve performance? Make it a game! Add game mechanics and dynamics to the next app you build, layer them onto an existing application, and bake them into your next process improvement initiative.


Even sites like the Intel IT Peer Network use game mechanics to increase engagement. You earn points for all kinds of activity: logging in, running searches to find content, posting a blog. I find it interesting that while these points earn you badges and levels, they offer little tangible value. Nevertheless, I found myself disappointed during a recent systems upgrade to have my points reset to zero. Alas, I am an Acolyte again!
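
As a rough sketch of how that kind of activity-based scoring works, consider the following. The point values and badge names (other than Acolyte, the starting rank mentioned above) are my guesses, not the IT Peer Network's actual rules.

```python
# Hypothetical point values per site activity; the real rules aren't published here.
POINTS_PER_ACTIVITY = {"login": 5, "search": 2, "post_blog": 50}

# Badge thresholds, lowest first. "Acolyte" is the starting rank from the post;
# the higher ranks are invented for illustration.
BADGES = [(0, "Acolyte"), (250, "Contributor"), (1000, "Luminary")]

def badge_for(points):
    """Return the highest badge whose threshold the point total meets."""
    earned = BADGES[0][1]
    for threshold, name in BADGES:
        if points >= threshold:
            earned = name
    return earned

activity_log = ["login", "search", "search", "post_blog", "login"]
total = sum(POINTS_PER_ACTIVITY[a] for a in activity_log)
print(total, badge_for(total))  # 64 Acolyte

total = 0  # a systems upgrade wipes the counter...
print(badge_for(total))         # ...and you are an Acolyte again
```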


Now, back to the CIO Superhero Game.


Recently, I had the opportunity to interview our Superhero CIO winner. Here are just a few of his thoughts surrounding gamification.


What caught your attention enough to want to play the game?


“I really thought that Twitter was a very unique way to play a game. It was not something I had ever done before. I’m a frequent reader of all of your writings, so I knew I was in for a learning experience if nothing else. I’m a Twitter addict, so I felt comfortable diving into the CIO world even as someone not extremely knowledgeable on the topic. Frankly, I’d rather have an hour-long dentist appointment than read an instruction manual. This was easily accessible – right at my fingertips and very self-explanatory.”


Games can engage, games can inspire, games can teach. What is the most important lesson you learned from playing the game?


“While I never truly understood the intricacies of being a CIO, I always appreciated the hard work and dedication it took to get to such a prestigious level. After going through the CIO Superhero game, I can honestly say that I now genuinely respect it. The passion behind the game was something I enjoyed. It wasn’t a bland exercise built with little thought or substance. I could feel that the game was designed to teach and to help grow others into not only understanding new topics previously unknown, but being inspired to be proactive in sharing and creating their own ideas. That is when you know you have something special. More than the topics themselves, the passion behind what the game was meant to do is what was really able to draw me in.”


You are not a CIO yourself. Do you think gamification of a process would work in your business, and if so, can you give an example?


“Any tool that can supply a different approach to creating a better understanding of a current process is always worth the attempt. I also think the concept of gamification is able to provide a different perspective, which can spark new ways to think about old processes. Implementing gamification could highlight the variables within our industry that can, in turn, allow for a more personable approach. Cost, scheduling, bookings…logistics are important, but a game tailored to our industry could be much more personal and deal directly with the relationships of all parties involved in a relocation. Whereas a typical goal would be to complete an on-time relocation with small out-of-pocket costs, the game’s primary objective would be to receive positive feedback from customers, clients, etc. Yes, on-time and small cost could equate to this outcome, but not always. “Defeat the evil henchmen” by coming up with a new idea to improve customer service, for instance. By defining the game’s objectives from a relationship standpoint, you can spark new and creative ways of thinking.”


So, there you have it.


Gamification – just one of the myriad changes impacting the CIO today. It truly is a “game” changer that can increase adoption and engagement across a variety of businesses and processes.


This is a continuation of a series of posts titled “The CIO is Dead! Long Live the CIO!” looking at the confluence of changes impacting the CIO and IT leadership. #CIOisDead. Next up: “Faster than a speeding bullet – The Speed of Change”.

Jeffrey Ton is the SVP of Corporate Connectivity and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.


Find him on LinkedIn.

Follow him on Twitter (@jtongici)

Add him to your circles on Google+

Check out his posts on Intel’s IT Peer Network

Read more from Jeff on Rivers of Thought


Unleashing the Digital Services Economy

Today Intel delivered a keynote address to 1000+ attendees at the Open Compute Project European Summit in Paris. The keynote, delivered by Intel GM Billy Cox, covered Intel’s strategy to accelerate the digital services economy by delivering disruptive technology innovation founded on industry standards. The foundation of that strategy is an expansion of silicon innovation: augmenting the traditional Xeon, Xeon Phi, and Atom solutions with new standard SKUs and custom solutions based on specific workload requirements. Intel is also expanding its data center SoC product line with the planned introduction of a Xeon-based SoC in early 2015, which is sampling now. This will be Intel’s 3rd-generation 64-bit SoC solution.


To further highlight this disruptive innovation, Cox described how Intel is working closely with industry leaders Facebook and Microsoft on separate collaborative engineering efforts to deliver more efficient solutions for the data center. Intel and Facebook engineers worked together on Facebook’s new Honey Badger storage server for its photo storage tier, featuring the Intel® Atom™ processor C2000, a 64-bit system-on-chip. The high-capacity, high-density storage server offers up to 180TB in a 2U form factor and is expected to be deployed in 1H’15. Cox also noted that Microsoft has completed the 2nd-generation Open Cloud Server (OCSv2) specification, and that Intel and Microsoft have jointly developed a dual-processor board for OCSv2, built on Intel Xeon E5-2600 v3 series processors, that delivers 28 cores of compute power per blade.
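
Those density figures are easy to sanity-check with back-of-the-envelope arithmetic. The per-drive capacity and cores-per-socket numbers below are my inferences, since the post only gives the totals:

```python
# Honey Badger photo-storage server: up to 180 TB in a 2U form factor.
tb_per_rack_unit = 180 / 2
print(tb_per_rack_unit)   # 90.0 TB per rack unit

# Assuming 6 TB drives (an assumption, not stated in the post), that implies:
print(180 / 6)            # 30.0 drives per 2U chassis

# OCSv2 blade: dual-socket Xeon E5-2600 v3 with 28 cores per blade,
# which works out to two 14-core parts (my inference from the total).
print(28 / 2)             # 14.0 cores per socket
```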


Collaboration with Open Compute reflects Intel’s decades-long history of collaborating with industry organizations to accelerate computing innovation. As one of the five founding board members of the Open Compute Project, we are deeply committed to enabling broad industry innovation by openly sharing specifications and best practices for high-efficiency data center infrastructure. Intel is involved in many OCP working-group initiatives, spanning rack, compute, storage, network, C&I, and management, which are strategically aligned with our vision of accelerating rack-scale optimization for cloud computing.

 

At the summit, Intel and industry partners are demonstrating production hardware based on Open Compute specifications. We look forward to working with the community to help push data center innovation forward.


Adopting & Enabling OpenStack in the Enterprise: A look at OpenStack Summit 2014

As I discuss the path to cloud with customers, one topic that is likely to come up is OpenStack. It’s easy to understand the inherent value of OpenStack as an open source orchestration solution, but this value is balanced by ever-present questions about OpenStack’s readiness for the complex environments found in telco and enterprise. Will OpenStack emerge as a leading presence in these environments, and in what timeframe? What have lead adopters experienced with early implementations and POCs? Are there pitfalls to avoid, and how can we use these learnings to drive the next wave of adoption?

 

This was most recently a theme at the Intel Developer Forum, where I caught up with Intel’s Jonathan Donaldson and Das Kamhout on Intel’s orchestration strategy and its effort to apply key learnings from the world’s most sophisticated data centers to broad implementations. Intel is certainly not new to the OpenStack arena, however, having been involved in the community from its earliest days and, more recently, having delivered Service Assurance Administrator, a key tool that gives OpenStack environments better insight into underlying infrastructure attributes. Intel has even helped lead the charge on enterprise implementation by integrating OpenStack into its own internal cloud environment.

 

These lingering questions on broad enterprise and telco adoption make the upcoming OpenStack Summit a must-attend event for me this month. With an agenda loaded with talks from leading enterprise and telco experts at companies like BMW, Telefonica, and Workday on their experiences with OpenStack, I expect to get much closer to the art of the possible in OpenStack deployment, and to learn more about how OpenStack providers are progressing with enterprise-friendly offerings.

If you’re attending the Summit, be sure to check out Intel’s lineup of sessions and technology demonstrations, and connect with the Intel executives on site discussing our engagements in the OpenStack community and our work with partners and end customers to drive broad use of OpenStack in enterprise and telco environments.

If you don’t have the Summit in your travel plans, never fear: Intel will help bring the conference to you! I’ll be hosting two days of livecast interviews from the floor of the Summit. We’ll also publish a daily recap of the event on the DataStack with video highlights, the best comments from the Twitterverse, and much more. Please send input on the topics you want to hear about from OpenStack Summit to ensure our updates match the topics you care about. #OpenStack


Going Green With Your Data Center Strategy

For an enterprise attempting to maximize energy efficiency, the data center has long been one of the greatest sticking points. A growing emphasis on cloud and mobile means growing data centers, which by nature demand a gargantuan amount of energy to function. And according to a recent survey on global electricity usage, data centers are consuming more energy than ever before.

 

George Leopold, senior editor at EnterpriseTech, recently dissected Mark P. Mills’ study, “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, And Big Power.” The important grain of salt: the study was funded by the National Mining Association and the American Coalition for Clean Coal Electricity. Even so, it contains some stark statistics that shouldn’t be dismissed out of hand.

 

“The average data center in the U.S., for example, is now well past 12 years old — geriatric class tech by ICT standards. Unlike other industrial-classes of electric demand, newer data facilities see higher, not lower, power densities. A single refrigerator-sized rack of servers in a data center already requires more power than an entire home, with the average power per rack rising 40% in the past five years to over 5 kW, and the latest state-of-the-art systems hitting 26 kW per rack on track to doubling.”
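
For context, the quoted 40 percent rise over five years works out to roughly 7 percent compound growth per year, and it implies a starting point of about 3.6 kW per rack. A quick check, treating the quoted "over 5 kW" endpoint as approximately 5 kW:

```python
# Average power per rack rose 40% over five years to "over 5 kW".
end_kw = 5.0                          # quoted endpoint (approximate)
start_kw = end_kw / 1.40              # implied level five years earlier
annual_growth = (end_kw / start_kw) ** (1 / 5) - 1

print(round(start_kw, 2))             # 3.57 kW per rack, five years ago
print(round(annual_growth * 100, 1))  # 7.0 (% per year, compounded)

# State-of-the-art racks at 26 kW are already >5x the average,
# and "on track to doubling" points toward ~52 kW.
print(26 / end_kw, 26 * 2)            # 5.2 52
```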

 

More Power With Less Energy

 

As Leopold points out in his article, providers are developing solutions to meet growing demand while still cutting their carbon footprint. IT leaders can rethink energy usage by concentrating on air distribution and trying assorted cooling methods, ranging from containment cooling to hot huts (a method pioneered by Google). Thorium-based nuclear reactors are also gaining traction in China, but they don’t necessarily solve waste issues.

 

If the average U.S. data center is more than 12 years old, IT leaders need to start looking at the tech powering their own data centers and rethinking the demand on the horizon. Perhaps the best way to go about this is to think about the foundation of the data center at hand.

 

Analysis From the Ground Up

 

Intel IT has three primary areas of concern when choosing a new data center site: environmental conditions, fiber and communications infrastructure, and power infrastructure. These three criteria bear the greatest weight on the eventual success, or failure, of a data center. So when you think about your data center site in the context of those criteria, ask yourself: Was the initial strategy wise? How does threat proximity compare to resource proximity? What does the surrounding infrastructure look like, and how does it affect the data center? If you could go the greenfield route and build an entirely new site, what would you retain and what would you change?
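
One way to make those questions actionable is a simple weighted scorecard across the three criteria. Below is a toy sketch; the weights, candidate sites, and 0-10 scores are invented for illustration and are not Intel IT's actual methodology.

```python
# Intel IT's three site-selection criteria as a weighted scorecard.
# The weights and the candidate scores below are hypothetical.
WEIGHTS = {
    "environmental_conditions": 0.3,  # climate, threat proximity
    "fiber_and_comms": 0.3,           # network infrastructure
    "power_infrastructure": 0.4,      # grid capacity, cost, redundancy
}

candidates = {
    "site_a": {"environmental_conditions": 7, "fiber_and_comms": 9,
               "power_infrastructure": 6},
    "site_b": {"environmental_conditions": 8, "fiber_and_comms": 6,
               "power_infrastructure": 9},
}

def score(site):
    """Weighted sum of a candidate's per-criterion scores."""
    return sum(WEIGHTS[c] * site[c] for c in WEIGHTS)

# Rank candidates, best first.
for name, site in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
    print(name, round(score(site), 2))  # site_b 7.8, then site_a 7.2
```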

 

Every data center manager in every enterprise has likely considered the almost counterintuitive concept that more power can come with less energy. But doing more with less has been the mantra since the beginning of IT. It’s a challenge inherent to the profession. Here at Intel, we’ll continue to provide invaluable resources to managers looking to get the most out of their data center.

 

To continue the conversation, please follow us at @IntelITCenter or use #ITCenter.


5 Questions for Howard A. Zucker, MD, JD

Health IT is a hot topic in the Empire State. New York was the first state to host an open health data site and is now building the Statewide Health Information Network of New York (SHIN-NY), which will enable providers to access patient records from anywhere in the state.

 

To learn more, we caught up with Howard A. Zucker, MD, JD, who was 22 when he got his MD from George Washington University School of Medicine and became one of America’s youngest doctors. Today, Zucker is the Acting Commissioner of Health for New York State, a post he assumed in May 2014. Like his predecessor Nirav R. Shah, MD, MPH, Zucker is a technology enthusiast who sees EHRs, mobile apps, and telehealth as key components of improving our health care system. Here, he shares his thoughts.

 

What’s your vision for patient care in New York in the next five years?

 

Zucker: Patient care will be a more seamless experience for many reasons. Technology will allow for further connectivity. Patients will have access to their health information through patient portals. Providers will share information on the SHIN-NY. All of this will make patient care more fluid, so that no matter where you go – a hospital, your doctor’s office or the local pharmacy – providers will be able to know your health history and deliver better quality, more individualized care. And we will do this while safeguarding patient privacy.

 

I also see a larger proportion of patient care taking place in the home. Doctors will take advantage of technologies like Skype and telemedicine to deliver that care. This will happen as patients take more ownership of their health. Devices like Fitbit amass data about health and help people take steps to improve it. It’s a technology still in its infancy, but it’s going to play a major role in long-term care.

 

How will technology shape health care in New York and beyond?

 

Zucker: Technology in health and medicine is rapidly expanding – it’s already started. Genomics and proteomics will one day lead to customized medicine and treatments tailored to the individual. Mobile technology will provide patient data to change behaviors. Patients and doctors alike will use this type of technology. As a result, patients will truly begin to “own” their health.

 

Personally, I’d like to see greater use of technology for long-term care. Many people I know are dealing with aging parents and scrambling to figure out what to do. I think technology will enable more people to age in place in ways that have yet to unfold.

 

What hurdles do you see in New York and how can you get around those?

 

Zucker: Interoperability remains an ongoing concern. If computers can’t talk to each other, then this seamless experience will be extremely challenging.

 

We also need doctors to embrace and adopt EHRs. Many of them are still using paper records. But it’s challenging to set up an EHR when you have patients waiting to be seen and so many other clinical care obligations. Somehow, we need to find a way to make the adoption and implementation process less burdensome. Financial incentives alone won’t work.

 

How will mobility play into providing better patient care in New York?

 

Zucker: The human body is constantly giving us information, but only recently have we begun to figure out ways to receive that data using mobile technology. Once we’ve mastered this, we’re going to significantly improve patient care.

 

We already have technology that collects data from phones, and we have sensors that monitor heart rate, activity levels and sleep patterns. More advanced tools will track blood glucose levels, blood oxygen and stress levels.

 

How will New York use all this patient-generated health data?

 

Zucker: We have numerous plans for all this data, but the most important will be using it to better prevent, diagnose and treat disease. Someday soon, the data will help us find early biomarkers of disease, so that we can predict illness well in advance of the onset of symptoms. We will be able to use the data to make more informed decisions on patient care.


Delivering on Choice for Hybrid Cloud Customers with Intel & EMC

Stu Goldstein is a Market Development Manager in the Communications and Storage Infrastructure Group at Intel

 

When I purchased new laptops for my sons as they went off to college, a big part of the brand decision revolved around support and my peace of mind. Sure enough, one son blew his motherboard when he plugged into an outlet while spending a summer working in China, and the other trashed his display when playing jump rope with his power cord, pulling his PC off the bed. In both cases I came away feeling good about the support I received from the brand I trusted.

 

So, knowing a bit more about how I think, it should not be a big surprise that I see today’s EMC Hybrid Cloud announcement as important. Enterprises moving to converged, software-defined storage infrastructures should have choices. EMC is offering the enterprise the opportunity to evolve without abandoning a successfully engineered infrastructure, including the support that will inevitably be needed. The creation of products that maximize existing investments while providing the necessary path to a secure hybrid cloud is proof of EMC’s commitment to choice. Providing agility moving forward without short-circuiting security and governance can be difficult; EMC’s announcement today recognizes the challenge. Offering a VMware edition is not surprising; neither is the good news about supporting a Microsoft edition. However, a commitment to “Fully Engineered OpenStack Solutions” is a big deal. Intel is a big contributor to open source, including OpenStack, so it is great to see this focus from EMC.

 

EMC has proven over the last several years that they can apply much of the underlying technology that Intel® Xeon® processors combined with Intel® Ethernet Converged Network Adapters have to offer. When Intel provided solutions that increased memory bandwidth by 60% and doubled I/O bandwidth generation over generation, EMC immediately asked, “What’s next?” Using these performance features coupled with Intel virtualization advances, the VMAX³ and VNX solutions prove EMC is capable of moving Any Data, Anytime, Anywhere while keeping VMs isolated to allow for secure shared tenancy. Now EMC is intent on proving it is serious about expanding the meaning of Anywhere. (The XtremIO scale-out products, by the way, are a great example of Anytime, using Intel architecture advancements to maintain a consistent 99th-percentile latency of less than 1 ms, the steady performance customers need to take the fullest advantage of this all-flash array.) EMC is in a unique position to offer customers of its enterprise products the ability to extend benefits derived from highly optimized deduplication, compression, flash, memory, I/O, and virtualization technology into the public cloud.
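
For readers unfamiliar with the metric, a consistent 99th-percentile latency under 1 ms means that 99 percent of requests complete in under a millisecond. Here is a small sketch of how you might verify that claim from measured samples; the data below is synthetic, not actual XtremIO measurements.

```python
import random

# Synthetic latency samples in milliseconds; not real XtremIO numbers.
random.seed(42)
samples = sorted(random.gauss(0.5, 0.15) for _ in range(10_000))

def percentile(sorted_vals, pct):
    """Nearest-rank percentile of an ascending-sorted list."""
    rank = int(pct / 100 * len(sorted_vals)) - 1
    return sorted_vals[max(0, min(rank, len(sorted_vals) - 1))]

p99 = percentile(samples, 99)
print(f"p99 = {p99:.3f} ms; under 1 ms: {p99 < 1.0}")
```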

 

Getting back to support: it is a broad term, but one that comes into laser focus when you need it. It has to come from a trusted source, no matter whether your storage scales up or out, or is open, sort of open, or proprietary. It costs something whether you rely on open source distros, OEMs, or smart people hired to build and support homegrown solutions. EMC’s Hybrid Cloud announcement recognizes that adding IaaS needs backing that covers you inside and out, or, said another way, from the inside into the outside. I look forward to seeing what IT managers do with EMC’s choices and the innovation this initiative brings to the cloud.
