ADVISOR DETAILS

RECENT BLOG POSTS

Accelerating Time to Insight in the Petroleum Geosciences Industry with Xeon Phi Co-processors

Today DownUnder GeoSolutions (DUG) announced the acquisition of a new HPC system from SGI. DUG and SGI worked closely with Intel while developing the system, which will feature a total of 3,800 Intel® Xeon Phi™ coprocessors. This is one of the largest commercial deployments of Intel® Xeon Phi™ coprocessors, and the largest such deployment intended for use in the petroleum geosciences industry.

“We’ve already started to see dramatic improvements in turn-around times when we compare our upgraded machines to those without coprocessors. Our time migration now runs more than ten times faster. Our depth migration runs six times faster. DUG has also seen its Reverse Time Migration (RTM) run significantly faster using this new technology,” said Dr. Matt Lamont, DUG’s managing director.

Dr. Lamont gives several reasons for choosing Intel® Xeon Phi™ coprocessors over GPU accelerators. His top researchers are more familiar with the many-core programming model, and when asked about the differences in programming difficulty, he stated that for Kirchhoff Time Migration, programming on the GPU was roughly four times more difficult than on the Intel Xeon Phi.

He also stated that overall Total Cost of Ownership, or “bang for the buck,” is a top consideration for DUG, and that the performance, programming environment and end price all play a role in this. For HPC workloads at DownUnder GeoSolutions, consisting of a full suite of seismic processing and imaging algorithms, the Intel® Xeon Phi™ coprocessors were selected as the best option.

“The innovative use of Intel® Xeon Phi™ coprocessors by DownUnder GeoSolutions is enabling their geophysicists to work with large seismic data sets interactively,” said Charles Wuischpard, vice president and general manager of Workstations and HPC at Intel. “In an industry where time is invaluable, the Intel® Xeon Phi™-based SGI system allows DUG to test more and faster, leading to better results in a much shorter period of time. Their integration of Intel® Xeon Phi™ coprocessors has enabled them to quickly adapt their existing code and immediately pass this value on to their customers.”

Visit with Intel and DownUnder GeoSolutions at SEG next week in Denver, booths 1693 (Intel) and 538 (DUG).

Read more >

Mobility In The Financial Services Industry — Right On The Money

Results from a recent survey of IT professionals working in the financial services sector show that a significant percentage of the industry is adopting a mobility strategy. CIOs at wealth management companies need to be prepared to offer solutions that focus on the needs of financial analysts and advisors. The IT leaders in these companies have a unique perspective on the technologies and devices that are capable of alleviating productivity bottlenecks.

 

Financial analysts and wealth managers need a device that supports their fast-paced profession yet still affords them the flexibility to get their work done wherever they are. Many tablets and other mobile devices are loaded with extraneous features, but most wealth managers just want technology that gets them to the numbers without getting in the way. For financial services companies looking to give their team an edge by offering a high-performance mobile device that works quickly and seamlessly and lets analysts focus on making decisions that benefit clients, the Microsoft Surface Pro 3 is a great investment.

 


Time Equals Money

 

When time is money, loading screens are your enemy. In a recent study, Principled Technologies tested a Microsoft Surface Pro 3, iPad Air, and Samsung Galaxy Note 10.1 side by side to compare performance on common tasks wealth managers perform, such as using financial planning software and researching mutual funds. The study also tested Microsoft Office and video performance using Microsoft Lync remote meeting software, a service commonly used for real-time collaboration and communication.

 

The difference in load times across the three platforms was substantial. When using the financial planning software MoneyGuide Pro, the Intel-powered Surface Pro 3 cut waiting times by 40% compared to the iPad Air and Galaxy Note 10.1. Additionally, while performing common mutual fund research on Fund Mojo, the Microsoft Surface Pro 3 outperformed the competition by 45%.

 

Don’t Buy Into Inflated Tablet Hype

 

While the iPad Air and Samsung Galaxy Note 10.1 have gained in popularity, the smart investor knows to dig deeper before making a move. In addition to leaving both tablets in the dust when it comes to research and portfolio management tasks, the Surface Pro 3, featuring a 4th generation Intel Core processor, demolished the competition in productivity time savings. The Surface Pro 3 allows users to access and edit Office documents 76% faster than the competition, and features full Microsoft Office 365 compatibility. The iPad Air and Samsung Galaxy Note 10.1 were only able to perform a fraction of the productivity tasks, and were even further disadvantaged when it came to video conferencing with Microsoft Lync. The Surface Pro 3 was the only device capable of participating in a Lync meeting and accessing all of the software’s features.

 

If you’re looking for a full-featured tablet to give your company’s wealth managers and financial analysts a significant return on investment, look no further than an Intel-powered Microsoft Surface Pro 3. For a full breakdown, check out the Principled Technologies white paper. To join the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter

Read more >

Optimized Data Center Performance – Intel’s Collaboration with Azure

Intel continues to deliver on its commitment to provide our customers with industry leading silicon optimized for their specific needs and workloads.

 

A year ago (Nov 2013), Intel talked about having delivered more than 15 customer-specific offerings. Earlier this year at Structure, Diane Bryant and I detailed Intel's plans to accelerate and expand its silicon customization capabilities, including the announcement of a coherent and customizable FPGA product and the Intel Xeon D product family. Just months later, in September 2014, we demonstrated our commitment to moving faster by announcing that the number of unique customized solutions for Intel Xeon E5 v3 had grown significantly to more than 35, with the addition of 20 new customer-specific solutions.

 

Today, we're excited to be a part of Microsoft's announcement of the new Azure "G-series" Virtual Machines: Azure's most powerful VMs to date, based on a customized Xeon solution from Intel. The "G-series" VMs deliver significant improvements in platform capability for cloud-based computing by leveraging the latest Intel® Xeon® processor E5-2600 v3 product family, and will target customers running large relational databases and big data workloads that require maximum performance and large memory. Working together to understand Microsoft's requirements, we were able to deliver a SKU optimized specifically to meet their unique performance, power, and feature needs. This is yet another example of how, working together, Intel and Microsoft are delivering cutting-edge innovation across the computing spectrum.

 

The Intel Xeon processor E5-2600 v3 product family offers the best combination of performance, built-in capabilities, and efficiency to address performance computing challenges. Compared to previous generations, the Xeon E5 v3 processors deliver up to 3x higher performance. (More info) Microsoft joined us on stage at the launch of Xeon E5 v3 in September to share their enthusiasm and our collaboration within Azure.

 

 

 

The G-series offers up to 32 cores, 448GB of RAM, and over 6TB of local SSD space. This new Azure offering is an excellent choice for large SQL and Oracle database systems, MongoDB, other database systems with high memory requirements, or enterprise applications.

 

Let us know what you think about the new G-series Azure offering.  To read more from Microsoft Azure and their G-series, visit microsoft.com

Read more >

Practical Challenges of Healthcare Security

From time to time we will look at healthcare IT environments from around the world to see how different countries approach healthcare technology challenges. Below is the second in a series of guest posts on the English NHS from contributor Colin Jervis.

 

In the UK, an aging population threatens to increase demand for healthcare and social services. My last post looked at the features of the integrated care needed to stem this tide and some of the security and confidentiality issues raised by sharing between organizations. Really, the only answer in the short- and medium-term is better models of care supported by Information and Communications Technology (ICT).

 

In addition, Baby Boomers are now aging and are likely to be far more assertive than their parents about healthcare quality and delivery. And they often have better ICT at home than they encounter in a spell with the NHS.

 

For sure, the management of long-term conditions is likely to be a competitive arena for public and private sector healthcare providers. Even among traditional NHS providers we already see the formation of GP consortia and of secondary care providers hiring salaried GPs to create new organizations.

 

Supporting this shift are wireless connectivity and data integration – moving away from traditional institutions and clinics and closer to care in a patient's home. But the great benefits this promises come with risks.

 

The NHS uses two-factor authentication to authorize access to systems that contain confidential patient data – password and smartcard. Something you know and something you have. This is practicable for most NHS staff; however, for some it is not.

 

In a busy emergency department with few end user devices, the time taken for an individual to log out and in to the electronic patient record each time is unbearable. So, what tends to happen is that someone logs in with their smartcard at the start of the day and remains logged in until the end of their shift, letting their colleagues use their access rights. Not what is intended, but difficult to censure when clinicians put addressing patient needs before information governance.

 

Further, clinicians who are mobile in the community often have issues with security. They can attend a patient at home and log in. Provided access is good and there is continuous interaction between patient, clinician and machine, this is fine.

 

However, some clinicians, such as physiotherapists, may have longer interventions away from the machine. To comply with security, the device times out after a few minutes. Logging in again is a pain, not to mention the possibility that – for example – an inquisitive family member could access the unattended machine while the connection is open. In the world of remote access security, form does not always follow function.

 

Two-factor authentication is sound; however, many ICT helpdesks rate password resets as the biggest source of user calls. Passwords are not easy for most people to remember, particularly if the structure is prescriptive (for example, at least one capital letter, one digit and one symbol) and the password also has to be changed regularly.

 

Nothing comes of nothing. With the greater use of ICT and the benefits of instant access and mobility, we must trade something. There is no activity that carries no risk. Even if I lie in bed all day to avoid being run over by a truck or attacked by a mugger, I still risk the disbenefits of inactivity, such as depression, heart disease and an overdose of comfort eating.

 

But how important to us is the confidentiality of healthcare information, particularly with the growth of wearable health devices and the smartphone app? I’ll address that in my next post.

 

What questions do you have?

 

Colin Jervis is an independent healthcare consultant. His book ‘Stop Saving the NHS and Start Reinventing It’ is available now. His website is kineticconsulting.co.uk, and he also posts on Twitter @colin_jervis.

Read more >

Transform IT – Episode 3 Recap: The Curvy Path

During the latest episode of the Transform IT show, Patty Hatter, Sr. VP of Operations and CIO at McAfee, challenged us to take what I called "the curvy path": to be unafraid of having a career path that doesn't look like a straight line. But the curvy path can be scary, right? The trick is in how you approach it.

 

Wasn’t it fun to hang out with Patty? What I love about her is that she is a no-nonsense, get-it-done executive who makes big things happen. But she also refuses to accept the status quo, is easy to talk to and she’s just a lot of fun to be with. What a powerful combination.


And as I was talking to her, I couldn’t help but think that her own “curvy path” is a lot of the reason why.


As she explained during the interview, she is able to relate to all of her counterparts because she has been in their shoes, at least in part, at different times in her career. I think that kind of depth and breadth of experience gives you an inner confidence that allows you to drop your guard a bit. I think that inner confidence – and the easy manner it engenders – came through loud and clear when I was talking with Patty.

 

So her challenge to each of us was to be unafraid of our own curvy path. To be willing to step off the safe, straight and narrow career path that most of us have been on, and to be willing to try something completely new and different.

 

It’s scary. It’s risky. But it’s what will give you the depth of experience that you need to have that kind of inner confidence in almost any situation.

 

So how will you step off the safety of the straight path and seek out the less direct, but much more interesting path that will lead you forward? It may be an uncertain future, but by embracing the uncertainty and becoming an intellectual and experiential explorer, you can prepare yourself for whatever that future may hold.

 

So what will it be? What will be your first step off the straight, safe path onto your own “curvy path”?

 

Share that first step with us in the comments below or via Twitter using #TransformIT and #ITChat. Taking that step is a critical decision that will put you on the path to getting some amazing things done at the intersection of IT and business!

 

If you missed Episode 3, you can watch it on demand here.

 

Also, make sure that you tune in on October 28th when I'll be talking to Frank Wander, former CIO at Guardian Life and author of the book Transforming IT Culture. We'll be discussing the similarities between wine and culture from his own personal wine cellar! You're not going to want to miss it. You can register for a calendar reminder here.

 

Join the Transform IT conversation anytime using the Twitter hashtags #TransformIT and #ITChat. Don’t forget that you can order my book, “The Quantum Age of IT” for 50% off thanks to the Intel IT Center: http://intel.ly/1pfz4tU

Read more >

How to Improve Hospital Efficiency

 

Efficiency is the goal for streamlined, affordable healthcare. But how do we get there?

 

In the above video, Gabi Daniely, vice president of Stanley Healthcare, talks about the company’s five hospital category solutions and how they can improve the operational efficiency of healthcare facilities.

 

How are you improving your facilities’ efficiency? Watch the clip and let us know what questions you have.

Read more >

Intel at SAP TechEd – Plan Your Schedule Now

SAP TechEd && d-code, coming to Las Vegas from Oct. 20 to 24, is SAP's premier event for IT architects, administrators, and developers. With over 1,000 hours of instruction on SAP technologies, plus roll-outs of new tools and live coding InnoJams, this event is designed to help front-line IT pros address the real challenges they face every day. With hands-on training and lots of opportunities for networking and collaboration, this is one event where what happens in Las Vegas definitely won't stay in Las Vegas!

 

Intel and SAP have shared a rich, innovation-based engineering collaboration for more than 8 years, and the relationship continues to evolve. The two companies worked together very closely during the development of SAP HANA*, the revolutionary in-memory database that powers real-time analytics and business solutions. The Intel® Xeon® Processor E7 Family and SAP HANA were co-optimized for superior performance, optimal reliability, enhanced security, and flexible management. Intel Xeon E7 is SAP HANA’s reference architecture design platform, and is the certified platform of choice for over 160 computing appliances with ten SAP HANA OEMs. Since we’re talking Las Vegas here, you can wager that the rich collaboration between SAP and Intel will continue—both companies are moving forward together with innovations for on-premises and cloud-based HANA platforms, and with solutions for enterprise mobility and the Internet of Things.

 

Intel has a full roster of keynote appearances, sessions, demos and other events at SAP TechEd, so start planning your itinerary now. Don't miss the Steve Lucas SAP Executive Keynote (5:45pm-7pm, Oct. 20, Venetian Ballroom, Level 2). Steve Lucas, president of SAP Platform Solutions, is always an entertaining and enlightening speaker, and you can bet that he will kick off the show by unveiling some exciting announcements. Shannon Poulin, vice president of the Intel Datacenter Group, will join Steve on stage to discuss the latest in Intel and SAP's ongoing collaboration, including news about SAP HANA in the cloud. With subscription-based SAP HANA now available via AWS and Virtustream, customers can try out SAP HANA in the cloud before investing in an on-site scale-up deployment.

 

Check out these technical sessions presented by Intel executives & experts:

  • DEV-114 Better Together (3:15-4:15pm, Oct. 21, Bellini 2103). Join Dietrich Banschbach, Intel director of SAP engineering, and learn how running SAP HANA on Intel Xeon E7 processors delivers up to 80 percent more performance and up to 80 percent lower TCO than alternative RISC architectures.
  • TEC212 Data Center Intelligence – SAP HANA Platform Extension for IT Departments (3:15-4:15pm, Oct 21, Lando 3202). Curt Aubley, Intel Data Center Group VP/CTO, will co-present with Nico Groh, SAP data center intelligence project owner. Learn about the ongoing Intel and SAP engineering effort to optimize SAP HANA power and performance on Intel® architecture and the enablement of Intel® Data Center Manager.
  • EXP17738 Accelerate the Performance of the SAP HANA Platform with Intel Architecture (4:30-5pm, Oct. 21, Lounge 3, Show Floor). Frank Ober, data center solution architect at Intel NV Memory Group, explains how Intel® Solid State Drives change the game for in-memory systems such as SAP HANA, where low latency and parallel data movement are key.
  • EXP17764 Increase SAP HANA Data Security: Deploying the Vormetric Solution (1:30-2pm, Oct. 22, Lounge 3, Show Floor). Martin Guttmann, principal architect for Intel Data Center Solutions Worldwide, and Sri Sudarsan, director of engineering for Vormetric, highlight the business benefits and functional capabilities of deploying Vormetric data security solutions for data encryption on SAP HANA and Intel Xeon E7 platforms in a cloud-hosted infrastructure.

 

Stop by the Intel booth (#3000) to check out demos by Intel partners such as Fujitsu, SGI, and Lenovo, which showcase scale-up solutions built on the Intel Xeon E7 v2 platform. You’ll have a chance to experience Intel®-based tablets and 2-in-1 devices as secure platforms for SAP HANA mobile apps; meet SAP HANA Data Center Intelligence* (SAP HDCI), which integrates Intel® Data Center Manager with SAP HANA for improved power management; and discover 3D cameras built on Intel® RealSense™ sensory-input technologies.

 

We’ll also host almost 20 half-hour Tech Talks at our booth, so stop by for a chance to hear experts deliver quick overviews of the latest co-innovations from Intel and SAP. I’ll be there to help film the Tech Talks, and – get excited – I’ll feature several of these in an upcoming blog.

 

Follow me at @TimIntel and watch for my Vine videos and man-on-the-street commentary and impressions on SAP TechEd keynotes and Intel sessions. Follow @IntelITcenter and join the dialogue with Intel IT experts, and follow @IntelSoftware to engage with Intel’s software community.  

 

See you in Las Vegas!

Read more >

IT Leadership – How Do I Get There and How Do I Move Up?

As I look back at my career (no, it's not over), I think about the important lessons I have learned. When I first started in IT, my first two promotions happened without any real involvement by me. I worked hard, did my job and my manager promoted me. I remember thinking this was great, but it was really my boss who was responsible for me being promoted.

 

All of a sudden, I noticed that others who worked just as hard were also getting promoted around me. As a result, I wasn't moving up as quickly as before, comparatively speaking. I began to spend time trying to understand why this was happening. I hadn't changed anything in what I was doing — I was still working hard, arriving on time and working well with others. So it took me a while to figure it all out.


I saw that these newly promoted individuals were taking an active role in their careers by seeking out new opportunities and new ways to demonstrate their skills to a wider audience. They were taking on projects that others didn’t want and delivering results.

 

I was not doing that.


Truthfully, the thought had never even occurred to me. To reach out and ask for work that was not inherently mine wasn’t something that I intuitively pursued.



From this realization, I started to look for these opportunities. I viewed it as a way for me to expand my knowledge and demonstrate the work I knew I could perform. Taking the time to meet with others, I focused on how I could help my surrounding colleagues and managers and, just as important, how they could help me. In this way, I connected with people who provided me with mentorship and guidance throughout my career.

 

The hard lesson that I ultimately learned was that my career was my own responsibility. I had to take an active role by seizing opportunities. It wouldn't be in my interest to wait around and play the selection game. I couldn't expect things to just happen.

 

For me, this change came about when I took the initiative to take on the projects that no one else wanted — the assignments that came with no fanfare. However, these menial tasks were still key to actual delivery, even though their success was not easy to measure. In such cases, failure was definitely an option. But while I thought that failure would mean early termination from the company, the truth was that it was only through failure that I was able to learn so much so quickly. As long as corporate policies were followed and we learned something during the process, our "failures" on a project would never be the cause of getting fired.

 

As I’ve worked over the years, I have come to a profound discovery regarding career promotion. When you start to climb the ladder, your boss is the one that promotes you. But as you reach the middle rungs of the corporate hierarchy, it’s actually your peers that promote you. And as you get closer to the upper reaches of executive level leadership, it is the peers in your specific industry or executives outside your current path that are the ones that move you up the ladder.

 

More often than not, this happens much sooner if you get directly involved rather than simply being in the right place at the right time. 

 

Good luck with the climb and connect with me on Twitter to let me know what you’ve learned along the way.

Read more >

Humanity Is the Heart of the IT Revolution

The CIO of today can no longer focus on just technology.

 

Our world is shaping itself more and more around tech every single day. The enterprise has been feeling the tug of consumerization, the strain of mobility, and the continuous development of the Internet of Things for years now, and CIOs are tackling problems greater than ever before. Users are demanding more convenience in spite of the rise of corresponding threats, as is the rest of the C-suite. So while an IT decision maker was once well-versed in technology and removed from the business, that's no longer the case.

 


Since tech is now a tremendous business driver, IT is more about the human needs shaping tech-oriented business decisions. Enterprises hiring new CIOs are looking for resume experience that reflects soft skills and business acumen. Leaders with a cross-disciplinary background are of greater value to both the business and the customer, and IT decision makers are starting to take notice.

 

Communication Breakdown

 

Erika Van Noort, director of consulting at Softchoice, recently told CIO.com, “Our theory is that within leadership roles, folks have to understand the entire business so they can better serve customers — both external and the internal customers, users, that IT supports. Our external clients are facing skills shortages not with technology and certifications, but with business skills and seeing the larger business strategy.”

 

As the innovation engine continues to toss new disruptors into the enterprise, a CIO has to be able to make a business case for the changes happening. Social, mobile, analytics, and cloud will continue to mold and shape the way tech fuses with business, and an IT decision maker is tasked with catering to the customer while still satisfying the business.  So it’s necessary to learn how to communicate with and understand the needs of each business unit that relies on tech.

 

Listen to and Learn From Your Users

 

Here at Intel, we've designed The Way We Work program, which aims to provide workstations better catered to the needs of employees. Our reasoning was simple: we, as humans, work better in a happy environment, and unhappy work conditions often undermine productivity. Improvements have ranged from digital whiteboards in meeting rooms to communal workspaces to wireless video conferencing equipment, and, one day, digital voice transcription and location-based sensors that allow users to find coworkers. Although it was a costly initial investment, the return seen through greater employee productivity has been undeniable.

 

The ideal is to let your users guide your strategy. IT is all about customer service, and our customers have changed. So put on your listening ears, strap on your CEO hat, and be ready to learn.

 

To continue this conversation, please follow us at @IntelITCenter or use #ITCenter.

Read more >

A Revitalized Argonne Returns to Compete at SC14 Competition

Mike Bernhardt is the Community Evangelist for Intel’s Technical Computing Group

 

Argonne National Laboratory's rich legacy of pursuing fundamental and applied science and engineering has led the lab to develop a world-class computational center that supports more than 800 active users and over 120 active projects from universities, national laboratories, and industry.

Last year, Associate Laboratory Director Rick Stevens led the Argonne Argonauts in the inaugural Intel Parallel Universe Computing Challenge (PUCC). This year he has passed the reins to Kalyan "Kumar" Kumaran, manager of Performance Engineering and Data Analytics in the Argonne Leadership Computing Facility (ALCF).

We managed to get a few moments of Kumar's time in the midst of a hectic schedule and DOE audits to answer a few questions about the 2014 Argonne team, which he has christened "Linear Scalers" in the hope that the new name will give them better luck than last year, when they were eliminated in the first round of competition.

 

The Argonne Linear Scalers include (L to R) Vitali Morozov, Performance Engineering, ALCF; Kumar Kumaran; Kevin Harms, Performance Engineering, ALCF; and Tim Williams, Computational Scientist, ALCF. Not pictured is Hal Finkel, Computational Scientist, ALCF.

 

Q. Rick Stevens was the team captain of last year’s team from Argonne called the Argonauts. He’s recruited you to fill that role this year. What did he tell you about the competition to convince you to take the lead?

A. Not much. Other than that he enjoyed the experience and Argonne should definitely take part this year. Also the Argonauts were not too lucky, so we changed the name and will return as the new and revitalized Linear Scalers.

 

Q. How do you think this competition will help others to understand the value of modernizing their code?

A. Developers will quickly notice how their portable code can be made Intel specific, but will run 1,000 times faster!

Q. How will your team prepare for this year’s challenge?

A. Pre-competition stretching, and coffee. We will start memorizing sections from James Reinders’ books.

Q. SC14 is using the theme “HPC Matters” for the conference. Can you explain why “HPC Matters” to you?

A. HPC matters because no single modern technology has had such an impact on such a wide range of research activities. The U.S. Department of Energy has a long history of building user facilities in support of science, but computing moved front and center a decade ago with the creation of the Leadership Computing Facility (LCF). The science being done in LCF centers is changing the world: producing better airplanes, accelerating discoveries of disease-fighting drugs, and designing better materials for everything from computer chips to new ways to store energy. All these advancements come from better simulation science, better codes, and better and faster HPC systems.

Read more >

The Quiet Transformation of Internal Communications

If you are a communications professional, a project manager, or an org leader, you've probably already found out by now that your social collaboration platform is changing the way you work at a very fundamental level. In addition to being a 'communicator', you are now a blogger, a curator, a viral marketer, a librarian, etc. Your responsibilities and the skills required to be successful look vastly different than they did a few years ago.

 

Three key shifts explain the transformation in internal corporate communications:

 

1. From ‘Communication’ to ‘Conversations’


Traditional communication tools enabled you to inform your audience about a change, but didn't offer much to engage them in a discussion. If you use newsletters or web-mailers, you need to closely manage your mailing list. There is no way to determine whether an email or web message has been filtered, deleted or, even worse, never reached specific people. Click-through statistics might give you a rough idea of the effectiveness of a piece of content, but there is very little feedback on how the audience actually responded to a message.

 

Your social platforms can offer a fresh new way to bridge this engagement gap. Something as mundane as an org announcement can evoke feedback (likes, shares, congratulatory messages!).

 

2. From ‘Communicator’ to ‘Curator’

 

If you manage communications for an organizational unit, you need to stay on top of the trending discussions and blogs written by employees. It is key to remember that not all content on the community site needs to be written by professional communicators or org leaders. You will find noteworthy content emerging from across the organization – and your job is to curate it and bubble up the best.

 

Tap into what employees are saying: in their blogs, in discussions and in smaller teams. Highlight the right conversations that add value to the discussion and give them visibility on your community page. Promote diversity of opinion and support your organization’s efforts in ensuring that all voices are heard.

 

3. From ‘Newsletters’ (publisher’s push) to ‘Newsfeeds’ (consumer’s pull)

 

This is by far the biggest change that you need to deal with and embrace when you adopt the enterprise social network for business communication.

 

When the newsletter was the tool of choice, you, as a communicator, were empowered to ‘push’ content to recipients that you had personally identified and chosen. In social communication, the paradigm shifts. The consumer now decides what content to follow and when to view it.

 

If your organization chooses to make your social platform the primary communication vehicle, you need to use traditional channels (web mailers, websites, etc.) to invite org members to 'follow' your community. Monitor the count of followers, and reinforce the 'get in or get left out' message with the primary target audience. Eliminate willful ignorance: deliberately ignoring the subscribe button is no excuse to plead ignorance of the information.

 

Once you hit the enrollment numbers, you will start seeing the benefits of the 'pull' model. You will get very "real" feedback on readership: 'likes', 'shares' and, of course, 'comments'. Your content could 'go viral' when primary readers share it with their extended network. Over time, you will get a much better pulse on content consumption patterns than you ever had with past tools.

 

I feel it is a particularly exciting time to be an internal business communicator. The cornerstones of communication strategy are still content, audience, and channel. But the social communication channel can bring about connectivity and engagement via human interactions like never before. All the best.

 

To continue the conversation, I would love to hear your insights in the comments below.

Read more >

How to Conduct 264 Years of Research in 18 Hours

 

In the above video, Cycle Computing CEO Jason Stowe talks about the strong disconnect that exists between research and clinical analysis. He says the current challenge in bio IT is to analyze data, make sense of it, and do actionable science against it.

 

He shares an example of a 156,000-core workload run in eight regions of the globe that produced 2.3 million hours of computational chemistry research (264 years’ worth) in just 18 hours. He says this capability will transform both access patterns and the kinds of research that pharmaceutical, life sciences, and healthcare companies are able to tackle when it comes to analyzing genomes.

 

Watch the clip and let us know what you think. What questions about research and clinical analysis do you have?

Read more >

Fostering a Culture Of (Device) Compatibility

Mobility and device freedom are becoming huge value adds for businesses that seek to offer more flexibility to their employees. As the movement gains traction, it’s creating numerous challenges for enterprise IT leaders. Security and maintenance are primary concerns for most BYOD strategies, but there are other aspects that, if left unaddressed, could nullify the intended productivity benefits.

 

One of the biggest enemies of productivity and a streamlined workflow is content decay. Content decay may occur when opening a document (such as a Microsoft Word or PowerPoint file) on a device running a different operating system than it was created on. For example, in recent tests performed by Prowess Consulting, Microsoft Excel spreadsheets opened on a Samsung Galaxy Note 10.1 using a third-party productivity software platform resulted in lost data, and secure documents were unlocked.

 

The video below illustrates additional ways in which content decay can derail productivity and expose sensitive information.

 

 

 

Developing a Zero Tolerance Policy For Content Decay

 

Content decay is a real concern for businesses, and the effects are compounded by the size of your workforce. The good news is that it’s largely avoidable. By developing a zero tolerance content decay policy, you can mitigate lost productivity and increase the security of your locked files. The best way to combat content decay is through device compatibility. By providing your employees devices that are designed to work together natively, you can ensure better business outcomes. In the Prowess study, the two devices that experienced no content decay when opening Microsoft Office documents were the Intel-powered HP ElitePad 1000 G2 and HP EliteBook Revolve 810 G2. These two devices offer the power and flexibility companies need in their mobility strategies, and also run Microsoft Windows natively.

 

For more information on how you can avoid content decay at your company, check out the full Prowess Consulting white paper. To join the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter

Read more >

Finding your new Intel SSD for PCIe (think NVMe, not SCSI)

Sometimes we see customers on Linux wondering where their new NVMe-capable SSD is on the Linux filesystem. It's not in the standard place, '/dev/sd*', like all those SCSI devices of the past 20+ years. So why is that, and where is it? For those of you new to the latest shipping Intel SSDs for PCIe, they run on the NVMe storage controller protocol, not the SCSI protocol. That's actually a big deal, because it means efficiency and a protocol appropriate for "non-volatile memories" (NVM). Our newest P3700 and related drives use the same industry-standard, open source NVMe kernel driver. This driver drives I/O to the device and is part of the block driver subsystem of the Linux kernel.


So maybe it is time to refresh on some not-too-familiar or rarely used Linux administrative commands to see a bit more. The simple part is to look in "/dev/nvme*". The devices are numbered, and the actual block device has an "n1" on the end to support NVMe namespaces. So if you have one PCIe card or front-loading 2.5″ drive, you'll have /dev/nvme0n1 as a block device to format, partition and use.
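
Because the NVMe namespace shows up as a standard block device, the familiar tools work unchanged. Here is a minimal sketch, assuming an empty drive, a single ext4 partition, and a hypothetical mount point of /mnt/nvme (adjust the device name, filesystem and mount point for your setup):

parted -s /dev/nvme0n1 mklabel gpt mkpart primary ext4 0% 100%   # one partition spanning the drive
mkfs.ext4 /dev/nvme0n1p1                                         # format the new partition
mkdir -p /mnt/nvme && mount /dev/nvme0n1p1 /mnt/nvme             # mount it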


These important Data Center Linux distributions:

Red Hat 6.5/7.0

SUSE 11 SP2

Ubuntu 14.04 LTS


…all have in-box NVMe storage drivers, so you should be set if you are at these levels or newer.


Below are some basic Linux instructions and snapshots to give you a bit more depth. The output below is from a Red Hat/CentOS 6.5 distribution.


#1

Are the drives in my system? Scan the PCI and block devices:

[root@fm21vorc10 ~]$ lspci | grep 0953

04:00.0 Non-Volatile memory controller: Intel Corporation Device 0953 (rev 01)
05:00.0 Non-Volatile memory controller: Intel Corporation Device 0953 (rev 01)
48:00.0 Non-Volatile memory controller: Intel Corporation Device 0953 (rev 01)
49:00.0 Non-Volatile memory controller: Intel Corporation Device 0953 (rev 01)

 

[root@fm21vorc07 ~]# lsblk

NAME          MAJ:MIN RM   SIZE RO TYPE MOUNTPOINT
sda             8:0    0   372G  0 disk
├─sda1          8:1    0    10G  0 part /boot
├─sda2          8:2    0   128G  0 part [SWAP]
└─sda3          8:3    0   234G  0 part /
nvme0n1       259:0    0 372.6G  0 disk
└─nvme0n1p1   259:1    0 372.6G  0 part

#2

Is the nvme driver built into my kernel:

[root@fm21vorc10 ~]$ modinfo nvme

filename:       /lib/modules/3.15.0-rc4/kernel/drivers/block/nvme.ko
version:        0.9
license:        GPL
author:         Matthew Wilcox <willy@linux.intel.com>
srcversion:     4563536D4432693E6630AE3
alias:          pci:v*d*sv*sd*bc01sc08i02*
depends:
intree:         Y
vermagic:       3.15.0-rc4 SMP mod_unload modversions
parm:           io_timeout:timeout in seconds for I/O (byte)
parm:           nvme_major:int
parm:           use_threaded_interrupts:int

 

#3

Is my driver actually loaded into the kernel

[root@fm21vorc10 ~]$ lsmod | grep nvm
nvme                   54197  0

 

#4

Are my nvme block devices present:

[root@fm21vorc10 ~]$ ll /dev/nvme*n1

brw-rw---- 1 root disk 259, 0 Oct  8 21:05 /dev/nvme0n1
brw-rw---- 1 root disk 259, 1 Sep 25 17:08 /dev/nvme1n1
brw-rw---- 1 root disk 259, 2 Sep 25 17:08 /dev/nvme2n1
brw-rw---- 1 root disk 259, 3 Sep 25 17:08 /dev/nvme3n1
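
If the open source nvme-cli utility is packaged for your distribution (newer releases are more likely to carry it, so treat this as an optional extra rather than part of the stock toolset above), it gives you another way to enumerate the same drives and pull per-device details:

nvme list                  # one line per NVMe namespace, with model, capacity and firmware
nvme smart-log /dev/nvme0  # temperature, media errors, percentage used and other health data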

 

#5

Run a quick test to see if you have a GB/s class SSD to have fun with.

[root@fm21vorc07 ~]# hdparm -tT --direct /dev/nvme0n1

 

/dev/nvme0n1:
 Timing O_DIRECT cached reads:  3736 MB in  2.00 seconds = 1869.12 MB/sec
 Timing O_DIRECT disk reads: 5542 MB in  3.00 seconds = 1847.30 MB/sec


Remember to consolidate and create parallelism as much as possible in your workloads. These drives will amaze you.
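
To see how much that parallelism matters, a synthetic load generator such as fio makes the point quickly. Here is a minimal sketch of a 4K random-read run at queue depth 32 across four jobs; the device path and run time are just examples, and random reads are used so the test leaves the drive's contents untouched:

fio --name=nvme-randread --filename=/dev/nvme0n1 --direct=1 --ioengine=libaio --rw=randread --bs=4k --iodepth=32 --numjobs=4 --runtime=30 --time_based --group_reporting

Run it again with --iodepth=1 and --numjobs=1 and compare: the gap between the two results is the parallelism these drives are built for.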


Have fun!


Read more >

How IT killed the auto insurance market

Automobiles are becoming smart. And the more IT is built into vehicles, the more car insurance companies will need to worry.


Recently, reports and studies of “driverless vehicles” have sparked public interest while encouraging the development and integration of smart technology in vehicles.  Today, we have cars and trucks that are not only able to drive themselves, but they can now talk to one another.

Hello Megatron!

As this technology becomes more and more prevalent in the public market, there will be a major increase in self-driving cars. As a result, many of the everyday driving risks will disappear. Let's imagine for a second… Speed limits will no longer be broken. Traffic jams will no longer occur. Road rage will not exist. Drowsy drivers can now take naps as their vehicles take them safely to their destination. Having lunch in the car, which was once limited to a cheeseburger in one hand and a soda between the legs, can now consist of a good bowl of soup with the use of a spoon – clearly a two-handed operation.

Want to use your cell phone for dialing or texting? Go ahead. Applying makeup? No problem. Teenage drivers? A-OK.

 

With an actual driver no longer being required, age restrictions for licenses will not be necessary.  In fact, licenses themselves will no longer be necessary.  In essence, the car becomes a device much like a smartphone or tablet. 

The best news of all: no more auto insurance needed.  With the elimination of human error, bodily injuries and accidents – what will we need to be covered for? Simply put, auto insurance companies will no longer be in business.

What does that mean for us?  No more commercials featuring Geckos, Flo or Cavemen.


Well, that's just even better news. One can only dream, right?

 

Doc

 

Read more >

Bringing Electronic Checklists to Healthcare

Doctors and surgeons are some of the brightest individuals in the world. However, no one is immune to mistakes and simple oversights. Unintentional errors occur in any industry; what makes healthcare different is that a single misstep could cost a life. 

 

In The Checklist Manifesto, Dr. Atul Gawande cites a fellow surgeon's story of a seemingly routine stab wound. The patient was at a costume party when he got into an altercation that led to the stabbing. As the team prepared to treat the wound, the patient's vitals began dropping rapidly. The surgeon and his team were unaware that the weapon was a bayonet that had gone more than a foot through the man, piercing his aorta.

 

Once the team regained control of the situation, the man recovered after a few days. The case presented complications that no one could have predicted without full knowledge of the situation. Gawande states, "everyone involved got almost every step right […] except no one remembered to ask the patient or the medical technicians what the weapon was" (Gawande 3). There are many independent variables to account for; a standard checklist for incoming stab wound patients could ensure that episodes like this are avoided and that other red flags are caught.

 

Miscommunication between clinicians and patients annually accounts for roughly 800,000 deaths in the US, more than heart disease and more than cancer. The healthcare industry spends roughly $8 billion on extended care as a result of clinical error every year. As accountable care continues to make progress, the healthcare industry is moving more toward evidence-based medicine and best practices. This is certainly the case for care providers, but for patients as well.

 

Implementing checklists in all aspects of healthcare can eliminate simple mistakes and common oversights by medical professionals and empower patients to become more educated and informed. Studies by the Journal of the American Medical Association (JAMA) as well as the New England Journal of Medicine (NEJM) have concluded that implementing checklists in various facets of care can reduce errors by up to half. Certain implementations of checklists in Intensive Care Units for infection mitigation resulted in reducing infections by 100 percent.

 

Compelling evidence of the need for checklisting can be found in the preparation process for a colonoscopy. Colonoscopy preparation is a rigorous process that requires patients to watch their diet and the clock for two days before the procedure. It is not uncommon for a colonoscopy to fail due to inadequate patient preparation. Before the procedure, the patient must follow an arsenal of instructions regarding food, liquid, and medication. A detailed checklist that guides each patient through the process would practically eliminate errors and failures due to inadequate patient preparation.

 

From the patient’s perspective, checklisting everything from pre-surgery preparation to a routine checkup should be a priority.   At the end of the day, the patient has the most at stake and should be entitled to a clear, user-friendly system to understand every last detail of any procedure or treatment.

 

A couple of companies are making waves in the area of patient safety checklists, most notable of which are BluMenlo and Parallax.

 

BluMenlo is a mobile patient safety firm founded in 2012. Its desktop, tablet, and mobile solution drives utilization of checklists for patient handoffs, infection mitigation, and Radiation Oncology Machine QA. Although its initial focus is in these areas, BluMenlo is expanding into standardizing best practices hospital- and ACO-wide.

 

Parallax specializes in operating room patient safety. Its CHaRM offering incorporates a Heads Up Display to leverage checklists in the Operating Room. The software learns a surgeon’s habits and techniques to accurately predict how long an operation may take as well as predict possible errors.

 

Electronic checklists will certainly take hold as health systems, ACOs and accountable care networks continue to focus on increased patient safety, improved provider communications and best practices for reducing costs across their organizations. We will even see these best practices expedited if we begin to inquire with our care providers as informed and engaged patients.

 

What questions about checklists do you have?

 

As a healthcare executive and strategist, Justin Barnes is an industry and technology advisor who also serves as an Entrepreneur-in-Residence at Georgia Tech’s Advanced Technology Development Center. In addition, Mr. Barnes is Chairman Emeritus of the HIMSS EHR Association as well as Co-Chairman of the Accountable Care Community of Practice.

Read more >

What Is Business Intelligence?

Early in my career, I was encouraged to always ask questions, even the most obvious and simple ones. This included questions about well-known topics that were assumed to be understood by everyone. With that in mind, let's answer the question, "What is business intelligence (BI)?"

 

As you read this post, you probably fall into one of these three categories:

  1. You know exactly what BI is because you eat, sleep, and breathe it every day. BI is in your business DNA.
  2. The term means nothing more than the name of an exotic tech cocktail that might have pierced your ears, figuratively speaking of course.
  3. You‘re somewhere in between the two extremes. You’ve been exposed to the term, but haven’t had a chance yet to fully digest it or appreciate it.

 

Do you have something to learn about BI? Let’s roll up our sleeves and get to work.

 

To begin with, BI looked very different when I started my career in the early '90s. You couldn't look it up on a mobile device smaller than a floppy disk. Moreover, you couldn't Google it, Bing it, or Yahoo it. Today, the keywords "business" and "intelligence" together return more than 250 million results on Google, though only a few will be relevant to you, and you won't have time to go through them all. Nevertheless, the ease and the speed at which you can query large volumes of recorded data to make faster, better-informed conclusions puts the question at hand in perspective.

 

Scratching the surface

 

Beginners to BI should start their research with the definition. Wikipedia’s definition of BI is a good place to start, and from it you get the sense that BI includes tangibles such as hardware and software as well as intangibles such as people, culture, processes, and best practices. Continuing on the Wikipedia page, you can find out about the origins of the term. In 1958, Hans Peter Luhn, an IBM researcher, defined the term as “the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal.” By the ‘90s, the term had become more widespread. At CIO.com, BI is defined as “an umbrella term that refers to a variety of software applications used to analyze an organization’s raw data.”

 

Digging deeper

 

Next, you can dig a little deeper by performing what I call a rapid-research exercise to glance at the websites of BI companies that develop the technology. In this way, your searches can transition from text-based and definition-centric explanations to visually rich and appealing presentations, including graphs and charts. This is where BI dashboards take center stage. Not surprisingly, the emphasis on mobile that showcases tablets and smart phones becomes apparent by pictures of BI artifacts shown on mobile devices. Additional references pop up for Big Data and Cloud. Both are hot technology terms that have gained popularity in the last few years. As you research and connect the dots, you can start to build your own definition of BI. This will be influenced by your own unique background, your experiences with technology (with or without BI), and possibly, your personal perceptions layered with your biases of BI. However, in the end, your definition may still fall short.

 

Hitting the core

 

Ultimately, BI is about decision making. In its simplest and purest form, I define BI as the framework that enables organizations of all sizes to make faster, better-informed business decisions.


I don’t claim that this particular definition of BI is better or more comprehensive than others. But it does provide a direct and concise answer with less emphasis on technology and more focus on business, people, and decision making.

 

When it comes to defining BI or technology in general, we need to put the focus on business and people more often. In this context, business decisions should be complemented by technology that promotes actionable insight, and not the other way around. BI is not a miracle pill.

 

BI alone does not solve business problems or cure corporate infections. Instead, BI is the enabler that, if designed, implemented, and executed effectively, can help organizations drive growth and profitability.

 

What is your definition of BI?

 

Connect with me on Twitter (@KaanTurnali) and LinkedIn.

 

This story originally appeared on The Decision Factor.

Read more >

Next Generation of CIOs Drive a New Style of Business

“The worst place to be as a CIO is to convince yourself you have control, when in fact you don’t,” says Intel CIO, Kim Stevenson in this interview on ComputerWeekly.com.  Stevenson hates the term Shadow IT – she views this as the enterprise at large becoming more educated about technology just like we continue to do in our personal lives.  Stevenson’s outlook symbolizes a new style of interaction with business stakeholders that is vital for competitive enterprises of the future.  It is no longer just about how CIOs help their stakeholders achieve their business objectives – it is also about the manner in which they present solutions in business terms. CIOs of tomorrow must drive a New Style of Business today across the enterprise.  Let us see what we can learn from the next generation of CIOs like Intel’s Stevenson.

 

Kim Stevenson.jpg

Twentieth Century Fox Executive VP and CIO John Herbert introduced the term Journey Management at HP Discover. By realizing business gains for his stakeholders through clearly defined metrics, Herbert is delivering Enterprise IT at the pace of business. In Herbert's words, Enterprise IT at Fox is a "Service Broker" today instead of an order-taker. This enables the business functions that matter most to his stakeholders.

 

In this CIO.com interview, HP Enterprise Services CIO, Steve Bandrowczak calls out a powerful but rarely mentioned quality for the New Style of CIOs: humility.  The humble CIO will emphasize his people’s importance more than his own. It is the same mindset that drove leaders like Gandhi, Lincoln and Mother Teresa to make big data matter and make a difference in the global enterprise.

 

This mindset drives a spirit of co-opetition rather than competition with other stakeholders.  No wonder Stevenson suggests that CIOs who have worked in a control style of IT service must relinquish control in situations where IT cannot add any value.

 

Stevenson also shares an example of presenting an IT solution in business terms. Rather than letting business peers know that you have a team of Data Scientists who can work magic, she suggests: "How about if you say, 'We can create a $10m return on investment in six months?'" This approach was applied to the Reseller SMART project. Her team used advanced analytics to provide insights about which customers were most likely to buy. The project delivered $20M in one year.

 

These are powerful messages from CIOs who integrate the business of IT every day. What is interesting is that they are still operating under the fundamental premise of Enterprise IT, enabling the business units to achieve their business objectives.  There is nothing intrinsically new about this premise. But, they are doing this with a different style of thinking and interaction that characterizes a New Style of Leadership to drive a New Style of Business.

 

How about you? What other characteristics would you suggest to drive this new style of business?

 

Team up with HP Technology Expert, E.G.Nadhan

 

Connect with Nadhan on: Twitter, Facebook, Linkedin and Journey Blog


Read more >