
RECENT BLOG POSTS

2014 Wrap-Up: What’s New, What’s Next?

Strategic IT decision-making was a big focus in 2014.


Ideas like integration, innovation, and accessibility were all hot topics for CIOs. While 2014 brought a barrage of enterprise-based technology issues to the forefront—mobility, security, IoT, and big data—the final takeaway was the “how” rather than the “what.”


How are CIOs going to sift through the noise and make careful, deliberate decisions regarding the implementation of new IT strategies?

 

As we look ahead to 2015, we also retrospectively look back at some of the important IT Peer Network conversations of 2014:

 

Pay vs. Passion in the Age of Data Science

 

In 2014, analytics cemented itself as a key component of business strategy, going from a “nice to have” to a “need to have.” With this increased awareness of its importance have come growing pains: the supply of qualified data scientists, who analyze and interpret the data, is still catching up to growing business demand for a more descriptive set of analytics. IT managers and CIOs should be wary of individuals attracted to the data scientist career path solely for a bigger paycheck and the presumption that they will be hired quickly. Michael Cavaretta, data scientist and manager at Ford Motor Company, recommends ways employers can find data scientists who want to cultivate their passion and add value to an organization:

 

“Some have proposed that we look at individual’s activities beyond formal education; participation in Data Science contests like Kaggle, TopCoder, or InnoCentive, or volunteer organizations like Data for Good, DataKind, or Code for America. Joining a Data Science oriented MOOC is also mentioned as a way to measure an individual’s passion for the field. Completion of a relevant, and well-regarded, MOOC is a good signal, but completion rates average less than 10%. But for individuals that don’t have formal Data Science training; people looking for a career change, or holding degrees without significant computer programming and/or statistics requirements MOOCs can be a valuable, and many times inexpensive, alternative.”

 

Strategic Leadership for Managing Evolving Cybersecurity Risks

 

As recent events have shown, security will continue to be a hot-button concern into 2015. With internal and external security breaches on the rise, Matthew Rosenquist, cyber security strategist at Intel, discusses the challenges and best practices of establishing a sound security structure:

 

“Cybersecurity is difficult. It is a serious endeavor which strives to find a balance in managing the security of computing capabilities to protect the technology which connects and enriches the lives of everyone.  Characteristics of cyber risk have matured and expanded on the successes of technology innovation, integration, and adoption.  It is no longer a game of tactics, but rather a professional discipline, continuous in nature, where to be effective strategic leadership must establish effective and efficient structures for evolving controls to sustain an optimal level of security.”

 

IT Leadership – How Do I Get There and How Do I Move Up?

 

There are many great technologists, but to take on the role of IT manager and become a true partner to the business, an individual must take initiative. Edward Goldman, CTO, enterprise segment at Intel, states that by taking charge and participating actively, an individual can flourish and become a highly effective IT leader:

 

“As I’ve worked over the years, I have come to a profound discovery regarding career promotion. When you start to climb the ladder, your boss is the one that promotes you. But as you reach the middle rungs of the corporate hierarchy, it’s actually your peers that promote you. And as you get closer to the upper reaches of executive level leadership, it is the peers in your specific industry or executives outside your current path that are the ones that move you up the ladder. More often than not, this happens much sooner if you get directly involved rather than simply being in the right place at the right time.”

 

German National Football Team Uses Real-Time Analytics for a Competitive Edge

 

The uses for analytics in professional sports are endless. In an industry where statistics and data have ruled for years, the technology for sports analytics is booming. R. Paul Crawford, data scientist at Intel, discusses why more and more professional sports teams are leveraging analytics to find a competitive edge and optimize talent:

 

“Like many other businesses, the sports and entertainment industry is starting to test and operationalize big data analytics. The annual MIT Sloan* Sports Analytics Conference has seized the mantle as the place to (at least publicly) talk about sports analytics. Optimized, personal training regimens, as well as perspective on overall team performance, enable teams to make the best use of their sometimes fantastically expensive talent investments…”

 

Information Technology is a huge investment for most businesses.  Choosing the right IT platform and strategy will be key for the enterprise in the coming year. According to a study by IDG Enterprise, Computerworld Forecast 2015, spending is expected to rise in 2015 – with a focus on security, cloud integration, and business analytics. The mobility and IoT space will see increased spending, too, as products like the Edison platform—Intel’s small, inexpensive, powerful processor—continue to allow entrepreneurs and designers room to innovate.

 

Thanks so much for being a part of our community this year. We can’t wait to see you back in the IT Peer Network in 2015!

 

To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

 

 


How Can HPC Assist Mainstream Businesses?

High-Performance Computing (HPC) isn’t just for high-end corporations and large scientific organisations. The cost of processing, coupled with the raw power of today’s servers, means that small and mid-sized businesses can also benefit from the advanced simulation that HPC provides.

 

Simulation can assist with many elements of product design, says Stephan Gillich, Director of Technical Computing for the Intel EMEA Datacenter Group.

 

It’s particularly useful for computer-aided engineering, e.g. in classical fields like mechanics and fluid dynamics, but also in finance, life sciences, and digital content creation, says Gillich.

 

The key point is that simulation is no longer just in areas where it has already been used for a long time, such as the aerodynamic design of planes. Now, you’ll find it in other product design areas to determine, for example, what happens to the components of a mobile phone when it hits the ground accidentally.
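
To make that concrete, a drop test ultimately reduces to integrating equations of motion. The following toy sketch (not any vendor’s CAE tool, which would also model deformation, contact, and materials) recovers just the impact velocity of a falling phone:

```python
# Toy free-fall "drop test": integrate the motion with simple Euler steps.
# A real CAE drop simulation solves millions of coupled equations for
# deformation and contact; this sketch only recovers the impact speed.

def impact_velocity(height_m, dt=1e-4, g=9.81):
    """Integrate free fall from height_m until the object hits the ground."""
    y, v = height_m, 0.0
    while y > 0.0:
        v += g * dt   # accelerate under gravity
        y -= v * dt   # advance position downwards
    return v          # speed at ground contact, in m/s

# Dropping a phone from 1.5 m; the analytic answer is sqrt(2*g*h) ≈ 5.4 m/s.
print(impact_velocity(1.5))
```

Real simulations extend this same time-stepping idea to vastly richer models, which is exactly where HPC cluster capacity comes in.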

 

“Mainstream businesses can now access simulating compute capacity on HPC clusters more easily and at a very reasonable cost,” says Gillich. This is enabling them to go beyond the limitations of the workstations they currently use.

 

The small automotive supply engineering house Dörrer + Broßmann carried out a proof-of-concept to see how a cluster based on the Intel® Xeon® processor E5 product family can enable it to carry out more-sophisticated engineering simulations more quickly.

 

The more precise simulation services also open up opportunities for Dörrer + Broßmann to pitch for business that was previously too compute-intensive for the company to carry out.

 

From a technology perspective, the server platform has improved, offering more powerful processors with more cores that are capable of handling more data in one operation, for example using Advanced Vector Extensions.
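
As an illustration of why wider vector units matter, compare an element-at-a-time loop with its array-at-a-time equivalent. This sketch uses NumPy, whose compiled array kernels are SIMD-friendly; the actual speedup depends on the hardware and is not guaranteed here:

```python
import numpy as np

# Element-wise multiply-add: the kind of data-parallel kernel that wide
# vector units (such as AVX) accelerate by processing several values per
# instruction.
a = np.arange(100_000, dtype=np.float64)
b = np.arange(100_000, dtype=np.float64)

def scalar_madd(a, b):
    # One element at a time: no opportunity for vectorisation in Python.
    out = np.empty_like(a)
    for i in range(len(a)):
        out[i] = a[i] * b[i] + 1.0
    return out

# Whole array at once: dispatched to an optimised, vectorised kernel.
vectorised = a * b + 1.0
assert np.allclose(scalar_madd(a, b), vectorised)
```

The same principle applies to simulation codes: the more data each instruction handles, the more work each core completes per cycle.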

 

Besides the processor, sophisticated network solutions such as Intel® TrueScale, and improved storage solutions using Intel SSDs, provide additional big improvements. On top of this, the software can be optimised so that processing is parallelised and makes the most of available compute cycles. The result is that we see the potential and benefits of simulation on a scale never seen before by SMEs: a “democratization” of HPC, comments Gillich.

 

Sectors such as life sciences are growing as businesses – as well as scientists – take hold of the opportunities that HPC offers them.

 

Cloud technologies for technical computing promise flexible, on-demand resources, democratizing HPC even further.

 

However, in order to deliver customers a high-performance experience without the complexity, there needs to be integration. “Basically people need the sort of interface they’re used to on a single workstation,” says Gillich.

 

The main business benefits of HPC simulation are these:

  • the opportunity to increase their competitiveness
  • the ability to bring better products to market faster


They can also cut product design costs, for example by simulating drop tests, product breakages, or the effect of water or pressure damage. There are cost-effective, entry-level HPC packages supported by hardware and software vendors such as ANSYS, Altair, and others, and new clusters are easier to set up.

 

Data centre and server management has also come a long way, with today’s servers and cloud-based technical computing solutions offering flexibility for changing workloads and faster setup.

 

Have you considered moving from workstation to HPC computing?

 

- Arif


8 Things You Should Know About Windows 10

Microsoft unveiled its next Windows operating system, Windows 10, at the end of September 2014. The forthcoming operating system has features specifically designed for business, including an updated user experience and enhanced security and management capabilities.

 


Here are eight things businesses should know about Windows 10:

 

1. Key dates

Microsoft released an early technical preview of Windows 10 for laptops and desktops on October 1, 2014, just over three years after it unveiled the first public beta build of Windows 8. Microsoft also launched its Windows Insider Program on October 1, designed to keep early adopters up to date with the latest preview builds of Windows 10. Then, from October 7, the preview build was available to Windows 7 users as well. (However, consumer preview builds will not be available until early next year.) The technical preview ends on April 15, 2015, timed to coincide with Microsoft’s Build 2015 conference, where Microsoft is likely to announce a release date for Windows 10. The company has promised that Windows 10 will ship to consumers and enterprises “later in the year” in 2015.

 

On October 13, Microsoft announced that over 1 million people are currently testing the Windows 10 technical preview. This is likely to include a number of enterprises planning for the future.

 

2. Multi-device platform

According to Microsoft, Windows 10 will continue to follow Microsoft’s strategy of making its operating system a platform that is suitable for use on multiple devices. The company describes it as a converged application platform for developers on all devices. Consequently, developers will be able to write an application once and deploy it easily across multiple device types, says Microsoft. This includes desktop PCs, smart phones, tablets and Xbox consoles. Microsoft adds, “Windows 10 will run across the broadest range of devices ever from the Internet of Things to enterprise data centres worldwide.”

 

3. New security features

Microsoft is continuing to focus on adding enterprise security features to its operating system. Windows 10 will feature identity and information protection technology. The new operating system will also have new features around user identities, to improve resistance to breach, theft or phishing. Windows 10 will additionally help advance data loss prevention by using containers and data separation at the application and file level, enabling protection that follows the data as it goes from a tablet or PC to a USB drive, email or the cloud.

 

4. Device management changes

With Windows 10, management and deployment have been simplified to help lower costs. Microsoft says it will offer in-place upgrades from Windows 7 or Windows 8 that are focused on making device wipe-and-reload scenarios obsolete. Businesses will be able to customise an app store to make it more specific to their needs and environment. The idea is an app store that allows for volume app licensing, flexible distribution, and the ability for organisations to reclaim or reuse licenses when necessary.

 

5. Interface tweaks

There are new interface enhancements, one of which is an expanded Start menu, replacing the Windows 8 navigation system. Microsoft says the familiar Start menu will be back, providing quick one-click access to the functions and files that people use most. It includes a new space that can be personalised with favourite apps, programs, people and websites. Apps from the Windows Store will open in the same format as desktop programs: they can be resized and moved around, and have title bars at the top allowing users to maximise, minimise and close with a click. Working in multiple apps at the same time will be easier and more intuitive, thanks to Snap improvements. A new quadrant layout allows up to four apps to be snapped on the same screen. Windows will also show other apps and programs running for additional snapping, and it will make intelligent suggestions on filling available screen space with other open apps. In addition, there will be a new Task View button on the task bar, enabling one view of all open apps and files, which will let the user switch quickly, with one-touch access to any desktop created.

 

Finally, Windows 10 will support multiple desktops. According to Microsoft, instead of too many apps and files overlapping on a single desktop, it will be easy to create and switch between distinct desktops for different purposes and projects — whether for work or personal use.

 

6. 2-in-1 features

Microsoft says that many of the new multitasking features, such as Task View, will be optimised for touch devices as well. However, Windows 10 will also have a hybrid interface mode for 2-in-1 laptops and hybrid devices. This will contain elements of the current Windows 8.1 Start screen, and the touch-focused interface can switch based on the input in use. Microsoft hinted that it will use large icons and respond to gestures and swipes, as well as to more traditional mouse or touchpad interaction with smaller buttons and list-like interfaces.

 

7. Cloud integration

Windows 10 is likely to be more integrated with the cloud than previous versions of Windows. With products like Office 365 and OneDrive staying at the centre of Microsoft’s cloud strategy, and continued advances in the Microsoft Windows Azure cloud platform, businesses can expect more cloud integration.


8. Open and collaborative development

Microsoft has introduced the Windows Insider Program, through which it can work closely with customers to help shape the future of Windows. Windows Insiders will be able to give feedback on early builds of the product throughout the development cycle. The program will include various ways for Windows Insiders to engage in a two-way dialogue with Microsoft, including a Windows Feedback app for sharing suggestions and issues and a Windows Technical Preview Forum for interacting with Microsoft engineers and fellow Insiders.

 

Although it’s early days, Windows 10 is coming. Now is a good time to investigate the technical preview and see how it might benefit your organisation.

 

- Arif


Will you be ready with the “Powerful Yes’s and Powerful No’s”?

In the latest episode of the Transform IT Show, those “Powerful Yes’s and No’s” made all the difference to our guest, Susan Cramm. Susan is a former CIO, former CFO of a large restaurant chain and is now a respected coach and author. She shared with us some contrarian views of what it means to be an effective IT leader…and so much of it seemed to come down to your ability to know yourself.

 

She shared that you must understand your passions and your gifts and honor them, but that you also need to invest in relationships and in really understanding those around you. We talked about an article she wrote arguing that the common advice to put together a great team and then let them do their job is sometimes dead wrong: in many cases, the leader needs to take the toughest assignments on themselves and use them as an opportunity to develop their team. She explained that the problem is that sometimes, as leaders, we start out by reflecting what we think we should be or do, rather than reflecting who we are. And that lack of self-awareness is where things go sideways.

 

We also talked about another article she wrote, in which she took issue with Sheryl Sandberg and her book “Lean In.” Her big issue wasn’t that the advice was wrong, per se, but that it showed only one possible way to handle the situation. In Susan’s view, sometimes you just need to be prepared to “lean out” instead. She argued that if your values aren’t in line with your company’s corporate values, then perhaps you should step away and create a culture of your own that is in line with your own values.

 

It really all comes down to knowing who you are and what you want out of life. And to know that, you need to take the time to think deeply about what you want to achieve. It is only with that knowledge that you can put all of the pieces together. For her, that came down to what she called those “Powerful Yes’s and Powerful No’s”. They were those moments that she was ready to step up and volunteer for a position that didn’t exist…and those times when she walked away from an opportunity that just wasn’t right at that time. She explained that by having a mindset of abundance rather than scarcity, you can have the confidence to make those decisions because you will believe that “if not here, then somewhere else.”

 

She wrapped up our time together by reminding us that we really only have two assets: our time and our ability to influence others. And that in order to influence others (which is the essence of leadership), that we needed to invest in relationships and always be focused on making those relationships work as much for the other person as it does for you.

 

Susan offered us some great insights and some thought provoking contrarian advice. Through her fascinating career spanning everything from hands-on coding, to consulting to heading IT and to being a CFO, she has developed a unique perspective that most of us will never have the opportunity to experience. So it was a great pleasure to be able to have Susan share that perspective with us.

 

You can watch the replay of our latest episode – Stepping In and Leading Out to be an Exceptional Leader.


Why Move Big Data Analytics into the Cloud?


With the growth of new data types and the increasing volume of available data, businesses need to implement analytics solutions to extract the most benefit from the Big Data to which they have access. Meanwhile, data analytics is moving from batch to real time. This is particularly the case for predictive analytics, which can help an organisation become future-focused.

 

Cloud is ideally positioned to provide the power and flexibility for Big Data analytics. Cloud computing itself has the potential to enhance business agility and productivity, while enabling greater efficiencies and reducing costs. Cloud and Big Data analytics technologies continue to evolve, and forward-thinking businesses are increasingly investigating both. A growing number of enterprises are building efficient and agile cloud environments that can process the large volumes, high velocity and varied formats of Big Data.

 

So, why move your analytics into the cloud? Here are three great reasons:

 

  • Cloud-based Analytics-as-a-Service (AaaS) has the power, flexibility and scalability to cope with Big Data. That said, cloud-based Big Data analytics is not a ‘one-size-fits-all’ solution; organisations using cloud infrastructure to provide AaaS have multiple options. The cloud platform might vary, depending on factors such as workload, cost, security, and data interoperability. IT might choose to utilise their private cloud to mitigate risk and maintain control.

 

  • Businesses might prefer to use public cloud infrastructure, platform, or analytics services to further enhance scalability. Cloud service providers are offering various data analytics solutions to meet different IT needs – from MapReduce to more complex analytics packages.

 

  • IT might implement a hybrid model that combines private and public cloud resources and services.
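
The MapReduce model mentioned above is easy to sketch in miniature. Assuming the canonical word-count workload (an illustration, not any specific cloud service’s API), map emits (key, value) pairs, a shuffle groups them by key, and reduce aggregates each group:

```python
from collections import defaultdict

# Minimal in-process sketch of the MapReduce model: map emits (key, value)
# pairs, a shuffle groups them by key, and reduce aggregates each group.

def map_phase(document):
    # Emit a (word, 1) pair for every word in the document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each group; here, sum the counts per word.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data in the cloud", "big analytics in the cloud"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"], counts["cloud"])  # each word appears in both documents
```

A cloud analytics service runs this same three-phase pattern, but with the map and reduce workers distributed across many machines and the shuffle moving data between them.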

 

Servers based on the Intel® Xeon® processor E5 and E7 families provide the performance and data handling capabilities for many different Big Data analytics environments. Advanced storage capabilities are also available through Intel Solid-State Drives (SSDs), featuring high-throughput and high endurance. Additionally, Intel Ethernet 10Gbit Converged Network Adapters provide high-throughput connections for large datasets.

 

The bottom line is: no matter which cloud delivery model makes the most sense, businesses with varying needs and budgets can unlock the potential of Big Data in cloud environments.

 

- Arif



6 Ways To Improve Collaboration in the Enterprise


Today’s businesses need to move fast, and good teamwork and collaboration are foundational.

 

Here are six ways to improve collaboration in the enterprise:

 

1. Enable BYOD/CYOD flexibility

Collaboration across the enterprise is enhanced when workers are free to be productive wherever they are. Bring Your Own Device (BYOD) and Choose Your Own Device (CYOD) schemes enable individual employees to work on the device that suits them best. People work in different ways, and there is now a massive range of computing devices available for businesses to offer their workforces. These include powerful smartphones, touch-screen tablets, 2-in-1 detachable and convertible notebooks, desktop-replacement Ultrabooks, clamshell mobile workstations, and all-in-one PCs. Added to these are the traditional fully-loaded desktops and mini desktops for home working, and office-based business workstations.

 

An important part of having BYOD/CYOD flexibility is to actually allow employees to work remotely on their device of choice. For some organisations, this will mean a change in policy or business culture. But as long as the business gives its workers freedom, whilst reinforcing that the organisation values collaboration and teamwork, employees will be both productive and loyal.

 

2. Centralise communication

It’s important to have some sort of central point for communication, delivered and managed by the business. This could be a fully-featured enterprise collaboration system, or it might be an intranet or social media platform. Providing that employees feel comfortable using the system to communicate and share information, collaboration will be the natural result. Cloud-based systems can offer access through any web-based device, which means that workers can collaborate at any time, from wherever they are. Clear and open communication is important in establishing productive collaboration. It’s also a good idea to have clear guidelines on how people should share information in an efficient way. If you can avoid duplicating information, and wasting time with irrelevant or repetitive communications, then your collaboration will be better for it.

 

3. Implement enterprise collaboration

There are many high-quality enterprise collaboration systems available, and you can set them up and have teams working together in minutes. Many enterprise collaboration tools have project or workflow management built in, which helps teams work together to a particular timeline. Microsoft SharePoint is one of the most popular enterprise collaboration tools on the market. It enables you to create online spaces for business projects, find and pull in team members using search, and keep all project-related materials in one centralised location. Team members can also have their own blogs, through which they can collaborate and share expert information.

 

Another good tool is Yammer, which has an intuitive user interface. It also has social media features such as newsfeeds, likes and event hashtags, which help teams to engage and collaborate. Other tools worth noting are Huddle for online content; Jive, particularly for mobile and remote collaboration; and a cloud-based content management system called Box, which enables teams to securely create, upload and share content quickly and easily.

 

4. Use social networking

Use business-centric social networking, perhaps on established channels like LinkedIn or Facebook, to give your teams the means to collaborate. It can also help you create a sense of identity for the business. However, social networking requires clear guidelines for employees, because the systems are often public-facing, which means the company is on show to outsiders. Other enterprise social networking tools to consider include tibbr, which allows you to customise your profile and follow employees who have similar interests.

 

Chatter is a secure social network for sales departments, which helps salespeople to work as a team. For teams that need to share videos, Kaltura allows businesses to store training workshops, product demos, and other video content in one central location. Videos can be posted privately or publicly and shared easily.

 

5. Encourage conferencing

Audio and video conferencing have become must-haves for the modern business, essential for linking up disparate teams, pulling in experts, and connecting people across the world. There are some great on-premise conferencing systems offering high-definition video and audio, which can project a highly professional corporate image. Meanwhile, the quality of web-based conferencing has improved to the point where you can create a highly successful collaborative meeting at a moment’s notice, using something like Skype.


There are also a number of excellent web conferencing services, such as GoToMeeting, Adobe Connect Pro, GlobalMeet and Cisco WebEx. These offer features such as the ability to record a meeting; real-time screen sharing; file transfer and remote desktop control. The better services let the presenter pass controls over to others, and allow for participant annotation. They may also allow duplex support, so that meeting attendees can speak without having to take turns.

 

6. Create a culture of collaboration

Lastly, it’s important to create a culture of collaboration, leading by example. Business heads should be using collaboration tools to share and disseminate information, and training should be available to help workers get the most out of the collaboration tools and software available to them. According to a study by training firm ESI International, many members of the workforce don’t possess the skills required to collaborate effectively, so they need instruction and guidance on how to do it. Research has also found that a very high proportion of business professionals need to improve their communication skills.

 

Professional development can include improving communication and collaboration skills, resulting in better teamwork, productivity and efficiency.

 

In addition, many businesses have found that they have been able to improve collaboration by redesigning the workplace. Practical changes you can make include altering room and desk arrangements to better support mobile working, hot-desking, and video and audio conferencing. Breakout areas should facilitate conversation and collaboration, and flexible spaces with high-quality conferencing facilities will enable teams based in different locations to work together on the same project.

 

Having a collaborative business is as much about mindset as it is about providing the right processes and technologies. How could you improve collaboration in your organisation?

 

- Arif

Read more >

Top 10 Things to Consider When Buying a Cloud Service

The distinctive features of cloud computing are now widely recognised. Unparalleled processing capabilities, agile virtualisation, and fast Gigabit networking enable enterprises to process more information than ever before at high speed.

 

Because of its ‘on-demand’ nature, cloud can provide businesses with significant cost reductions compared with traditional computing models. Cloud has also brought flexibility, scalability and performance to enterprise computing, and this can be achieved through private, public or hybrid clouds.

 

However, it’s important to note that cloud service providers vary greatly in their offerings, technical capabilities, service levels and data centre technologies. It’s therefore prudent to carry out due diligence before signing a contract with a cloud service provider. Here are ten things to consider:

 

  1. Price. Price is important, though it shouldn’t be the deciding factor. The cheapest service isn’t always the best one. Take a closer look at the features each cloud provider offers, then compare prices with those in mind. Look out for hidden costs, for example for data extraction, legacy application integration, and premium storage. There may be additional bandwidth charges, data access costs, migration fees, and internal and external support requirements. Calculating the long-term cost of externally provided cloud operations compared with on-premise is also a worthwhile exercise.
  2. SLAs. Along the same lines, scrutinise a potential provider’s service level agreement. Try to find out: who is responsible for losses due to a cloud outage? What is the response time for fixing failures or responding to customer requests? What performance does the service guarantee: response time, throughput, bandwidth, uptime? When it comes to backup, who is responsible for carrying it out, and where are backups stored? Lastly, what happens if you want to migrate off a provider’s infrastructure, and how badly could your service be affected during a data centre upgrade or software migration? According to Gartner, service providers differentiate themselves on factors such as service reliability, support, ease of management, turnkey deployments and integrated value-add services.
  3. Scalability. The promise of cloud, and particularly hybrid cloud, is that workloads can be scaled up quickly and massively by bursting out to public cloud. The ability of a prospective provider to scale on demand is therefore important. Think of seasonal peaks in workload, sudden business demand for analytics, workloads heavy in floating-point calculations, or high-bandwidth multimedia data. Can the provider’s infrastructure cope with your needs? Web-based enterprises should have growth in mind, and a good cloud services company should be able to accommodate both your current workloads and future growth.
  4. Porting to the cloud. If a provider is offering to migrate a legacy application to the cloud, find out whether this will be a one-off porting operation. Will the application be optimised for the new infrastructure, or left alone, however inefficient it might be once migrated? And what monitoring and maintenance will the provider offer for your application afterwards?
  5. Security. Cloud security is a maturing area, and one that is still at the top of the list for many enterprises. Providers can now implement security right across the stack, starting with encryption at the server processor level, running through their virtualisation systems, and up to user access authentication and controls. How secure is your potential cloud partner? What type of security are they offering, and what data protection standards do they meet? In terms of access control, is it internal – within the data centre – as well as external?
  6. Expertise. Different providers have different specialties, so look into a company’s area of expertise before you sign. They may be strong in a particular vendor technology, such as Microsoft, SAP or Oracle, or in integrating cloud with legacy apps. Their data centre engineers might be particularly good at optimising certain workloads, or they may have strong security credentials.
  7. Hardware. Does a potential provider have the latest server, storage and networking technologies? Are they investing in software-defined infrastructure (SDI), as many cloud service providers are? If so, how far along are they towards an agile, highly responsive infrastructure – one where whole workloads can be moved at will to optimise performance? Find out how extensively the provider has virtualised servers, storage and networking, and how effective their orchestration is.
  8. Support. If the service goes down, can you call the company and reach a real person? Can they diagnose the problem and answer your questions? Do they have access to engineers who can give a timeframe for fixing the issue? Are staff on call around the clock, and can they be reached promptly? Technical support is a really important area to investigate, and talking to existing customers will help you gauge how responsive the firm really is.
  9. Power. Power is a major data centre concern. You want your provider to be using power-efficient servers and equipment, because efficient operations translate into a lower-cost service. A provider’s carbon footprint is also worth investigating, to gauge how green their data centres are.
  10. Extras. Other considerations include how well the provider will integrate with your private cloud or on-premise systems. Do they expect you to do the bulk of the integration, or will they offer tools and help if you need them? What monitoring dashboards do they provide? Is there a comprehensive control panel, or is it difficult to monitor the various applications and service levels? Do you have input into the user interfaces that might be deployed, and can you customise them if they are unintuitive and hard to use? If not, there may be trouble ahead.
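When weighing SLAs, it helps to turn uptime percentages into concrete downtime budgets. A quick sketch (the SLA figures below are illustrative, not any particular provider’s terms):

```python
# Convert an SLA uptime percentage into the maximum downtime it permits.
# The percentages used here are common illustrative tiers, not quotes
# from a specific provider's contract.

def allowed_downtime_minutes(uptime_percent: float, period_days: int = 365) -> float:
    """Maximum downtime (in minutes) an uptime guarantee allows per period."""
    total_minutes = period_days * 24 * 60
    return total_minutes * (1 - uptime_percent / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% uptime -> {allowed_downtime_minutes(sla):.1f} minutes/year")
```

A “three nines” guarantee still permits nearly nine hours of outage per year, which is why the responsibility and response-time questions above matter as much as the headline percentage.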

 

Finally, disaster recovery and business continuity are areas worth investigating. Try to find out how well data is backed up and restored in the case of emergency. Make sure you are satisfied with the level of backup, and the speed at which services can be restored if required.

 

With these things in mind, choosing the right cloud services provider should be straightforward.

 

- Arif

Read more >

How Can HPC Assist Mainstream Businesses?

High-Performance Computing (HPC) isn’t just for high-end corporations and large scientific organisations. The cost of processing, coupled with the raw power of today’s servers, means that small and mid-sized businesses can also benefit from the advanced simulation that HPC provides.

 

Simulation can assist with many elements of product design, says Stephan Gillich, Director of Technical Computing for the Intel EMEA Datacenter Group.

 

It’s particularly useful for computer-aided engineering, e.g. in classical fields like mechanics and fluid dynamics, but also in finance, life sciences and digital content creation, says Gillich.

 

The key point is that simulation is no longer just in areas where it has already been used for a long time, such as the aerodynamic design of planes. Now, you’ll find it in other product design areas to determine, for example, what happens to the components of a mobile phone when it hits the ground accidentally.

 

“Mainstream businesses can now access simulation compute capacity on HPC clusters more easily and at a very reasonable cost,” says Gillich. This is enabling them to go beyond the limitations of the workstations they currently use.

 

The small automotive supply engineering house Dörrer + Broßmann carried out a proof of concept to see how a cluster based on the Intel® Xeon® processor E5 product family could enable it to carry out more sophisticated engineering simulations, more quickly.

 

The more precise simulation services also open up opportunities for Dörrer + Broßmann to pitch for business that was previously too compute-intensive for the company to carry out.

 

From a technology perspective, the server platform has improved, offering more powerful processors with more cores that are capable of handling more data in one operation, for example using Advanced Vector Extensions.

 

Besides the processor, sophisticated network solutions such as Intel® TrueScale, and improved storage solutions using Intel SSDs, provide additional big improvements. On top of this, software can be optimised so that processing is parallelised and makes the most of the available compute cycles. The result, comments Gillich, is that SMEs can see the potential and benefits of simulation on a scale never seen before: a “democratization” of HPC.
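The parallelisation mentioned above usually means decomposing a problem into independent pieces that run across cores or cluster nodes. A toy sketch of that pattern, estimating pi by splitting a numerical integration into chunks (note: in CPython, threads won’t actually speed up pure-Python arithmetic because of the GIL; the point here is the decomposition, which HPC codes apply with processes, MPI ranks or vector units):

```python
# Toy domain decomposition: split the integral of 4/(1+x^2) over [0, 1]
# (which equals pi) into independent chunks and evaluate them in parallel.
from concurrent.futures import ThreadPoolExecutor

def partial_integral(start: float, end: float, steps: int) -> float:
    """Midpoint-rule integral of 4/(1+x^2) over [start, end]."""
    h = (end - start) / steps
    return sum(4.0 / (1.0 + (start + (i + 0.5) * h) ** 2) for i in range(steps)) * h

def parallel_pi(chunks: int = 8, steps_per_chunk: int = 100_000) -> float:
    # Each chunk is an independent sub-interval; a cluster scheduler would
    # farm these out to different nodes in exactly the same shape.
    bounds = [(i / chunks, (i + 1) / chunks) for i in range(chunks)]
    with ThreadPoolExecutor() as pool:
        parts = pool.map(lambda b: partial_integral(b[0], b[1], steps_per_chunk), bounds)
    return sum(parts)

print(parallel_pi())  # close to 3.141592653589793
```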

 

Sectors such as life science are growing as businesses – as well as scientists – take hold of the opportunities that HPC offers them.

 

Cloud-based technical computing promises flexible, on-demand resources, democratising HPC even further.

 

However, in order to deliver customers a high-performance experience without the complexity, there needs to be integration. “Basically people need the sort of interface they’re used to on a single workstation,” says Gillich.

 

The main business benefits of HPC simulation are these:

  • the opportunity to increase competitiveness
  • the ability to bring better products to market faster.


They can also cut product design costs, for example by simulating drop tests, product breakages, or the effects of water or pressure damage. There are cost-effective, entry-level HPC packages supported by hardware and software vendors such as ANSYS and Altair, and new clusters are easier than ever to set up.

 

Data centre and server management has also come a long way, with today’s servers and cloud-based technical computing solutions offering flexibility for changing workloads and faster setup times.

 

Have you considered moving from workstation to HPC computing?

 

- Arif

Read more >

8 Ways to Secure Your Cloud Infrastructure


Cloud security remains a top concern for businesses. Fortunately, today’s data centre managers have an arsenal of weapons at their disposal to secure their private cloud infrastructure.

Here are eight things you can use to secure your private cloud.

 

1. AES-NI data encryption

End-to-end encryption can be transformational for the private cloud, securing data at all levels through enterprise-class encryption. The latest Intel processors feature Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), a set of new instructions that enhance performance by speeding up the execution of encryption algorithms.

 

The instructions are built into Intel® Xeon server processors as well as client platforms, including mobile devices.

 

When encryption software utilises them, the AES-NI instructions dramatically accelerate encryption and decryption – by up to 10 times compared with software-only AES.

 

This speedy encryption means that it is possible to incorporate encryption across the data centre without significantly impacting infrastructure performance.

 

2. Security protocols

By incorporating a range of security protocols and secure connections, you will build a more secure private cloud.

 

As well as encrypting data, clouds can also use cryptographic protocols to secure browser access to the customer portal, and to transfer encrypted data.

 

For example, the Transport Layer Security (TLS) and Secure Sockets Layer (SSL) protocols are used to assure safe communications over networks, including the Internet. Both are widely used for applications such as secure web browsing (through HTTPS), as well as email, IM and VoIP.

 

They are also critical for cloud computing, enabling applications to communicate over the network and throughout the cloud while preventing undetected tampering that modifies content, or eavesdropping on content as it’s transferred.
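A minimal illustration of putting TLS to work, using Python’s standard library: `create_default_context()` switches on certificate verification and hostname checking, the safeguards against the tampering and eavesdropping described above.

```python
# Sketch: a TLS client context with safe defaults from the Python
# standard library.
import ssl

ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: peer certificate required
print(ctx.check_hostname)                    # True: hostname is verified

# A client would then wrap its TCP socket before talking to a server, e.g.:
# secure_sock = ctx.wrap_socket(sock, server_hostname="example.com")
```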

 

3. OpenSSL, RSAX and function stitching

Intel works closely with OpenSSL, a popular open-source, multiplatform security library. OpenSSL’s FIPS module is certified to FIPS 140-2, a computer security standard developed under the National Institute of Standards and Technology’s Cryptographic Module Validation Program.

 

It can be used to secure web transactions through services such as Gmail, e-commerce platforms and Facebook, to safeguard connections on Intel architecture.

 

Two OpenSSL functions that Intel has contributed to are RSAX and function stitching.

 

The first is a unique implementation of the popular RSA 1024-bit algorithm that delivers significantly better performance than previous OpenSSL implementations. RSAX can speed up SSL session initiation by up to 1.5 times, providing a better user experience and increasing the number of simultaneous sessions your server can handle.

 

As for function stitching: bulk data buffers use two algorithms, one for encryption and one for authentication. Rather than encrypting and authenticating the data serially, function stitching interleaves instructions from the two algorithms. By executing them simultaneously, it improves the utilisation of execution resources and boosts performance.

 

Function stitching can result in up to 4.8 times performance improvement for secure web servers when combined with RSAX and Intel AES-NI.

 

4. Data loss prevention (DLP)

Data protection is rooted in the encryption and secure transfer of data. Data loss prevention (DLP) is a complementary approach focused on detecting and preventing the leakage of sensitive information, either by malicious intent or inadvertent mistake.

 

DLP solutions can profile content against rules and capture violations or index and analyse data to develop new rules. IT can establish policies that govern how data is used in the organisation and by whom. By doing this they can clarify security practices, identify potential fraud and avert accidental or unauthorised malicious transfer of information.
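The rule-profiling idea can be sketched very simply: scan outbound content against pattern rules and report any violations. This is a heavily simplified illustration (the rule names and patterns are invented); real DLP products add content indexing, fingerprinting and policy workflows.

```python
# Toy rule-based DLP scan: flag text that matches sensitive-data patterns.
import re

RULES = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # 13-16 digit runs
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(text: str) -> list[str]:
    """Return the names of the rules the text violates."""
    return [name for name, pattern in RULES.items() if pattern.search(text)]

print(scan("Card 4111 1111 1111 1111 sent to alice@example.com"))
# ['credit_card', 'email']
```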

 

An example of this technology is McAfee Total Protection for Data Loss Prevention. This software can be used to support an organisation’s governance policies.

 

5. Authentication

Protecting your platform begins with managing the users who access your cloud. This is a large undertaking because of the array of external and internal applications, and the continual churn of employees.

Ideally, authentication is strengthened by rooting it in hardware. With Intel Identity Protection Technology (Intel IPT), Intel has built tamper-resistant, two-factor authentication directly into PCs based on third-generation Intel Core vPro processors, as well as Ultrabook devices.

 

Intel IPT offers token generation built into the hardware, eliminating the need for a separate physical token. Third-party software applications work in tandem with the hardware, strengthening the authentication process.

 

Through Intel IPT technology, businesses can secure their access points by using one-time passwords or public key infrastructure.
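One-time passwords like those a hardware token generates are commonly produced with the HOTP algorithm (RFC 4226). This standard-library sketch shows the mechanics; Intel IPT performs the equivalent token generation in tamper-resistant hardware rather than in software like this.

```python
# HOTP (RFC 4226): HMAC-SHA1 over a counter, dynamically truncated to a
# short numeric code the user types in.
import hashlib, hmac, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # 755224
```

Time-based variants (TOTP, RFC 6238) simply derive the counter from the clock, which is what most phone authenticator apps do.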

 

6. API-level controls

Another way to secure your cloud infrastructure is by enforcing API-level controls. The API gateway layer is where security policy enforcement, and cloud service orchestration and integration, take place. The growing need to expose application services to third parties and mobile applications is driving the need for controlled, compliant application service governance.

 

With API-level controls, you gain a measure of protection for your departmental and edge system infrastructure, and reduce the risk of content-borne attacks on applications.

 

Intel Expressway Service Gateway is an example of a scalable software appliance that provides enforcement points and authenticates API requests against existing enterprise identity and access management systems.
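The enforcement-point idea can be sketched in a few lines: validate a caller’s credential against an identity store before the request ever reaches the backing service. This is an invented illustration of the pattern, not Intel’s gateway; the key names and store are hypothetical stand-ins for a real IAM lookup.

```python
# Toy API gateway enforcement point: unknown API keys are rejected
# before the request reaches any backend service.
VALID_KEYS = {"key-123": "partner-app"}  # stands in for an IAM system

def enforce(request: dict) -> dict:
    """Reject requests with unknown API keys; otherwise pass through."""
    caller = VALID_KEYS.get(request.get("api_key"))
    if caller is None:
        return {"status": 401, "error": "unauthenticated"}
    return {"status": 200, "caller": caller}

print(enforce({"api_key": "key-123"}))  # {'status': 200, 'caller': 'partner-app'}
print(enforce({"api_key": "bogus"}))    # {'status': 401, 'error': 'unauthenticated'}
```

A production gateway layers on rate limiting, schema validation of request bodies (the “content-borne attack” defence) and audit logging at the same choke point.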

 

7. Trusted servers and compute pools

Because of cloud computing’s reliance on virtualisation, it is essential to establish trust in the cloud. This can be achieved by creating trusted servers and compute pools. Intel Trusted Execution Technology (TXT) builds trust into each server by establishing a root of trust that helps assure system integrity.

 

The technology checks hypervisor integrity at launch by measuring the code of the hypervisor and comparing it to a known good value. Launch can be blocked if the measurements do not match.
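The measure-and-compare step behind that launch check can be illustrated in a few lines: hash the hypervisor image and block launch unless it matches a known good value. Intel TXT performs this in hardware and firmware; the code below is only a sketch of the principle.

```python
# Sketch of a measured launch: hash the code, compare to a whitelist,
# and refuse to launch on mismatch.
import hashlib, hmac

def measure(image: bytes) -> str:
    """The 'measurement' is just a cryptographic hash of the code."""
    return hashlib.sha256(image).hexdigest()

def allow_launch(image: bytes, known_good: str) -> bool:
    """Launch only if the measurement matches the known good value."""
    return hmac.compare_digest(measure(image), known_good)

hypervisor = b"pretend hypervisor code"
known_good = measure(hypervisor)          # recorded when the image was trusted
print(allow_launch(hypervisor, known_good))       # True
print(allow_launch(b"tampered code", known_good)) # False
```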

 

8. Secure architecture based on TXT

It’s possible to create a secure cloud architecture based on TXT technology, which is embedded in the hardware of Intel Xeon processor-based servers. Intel TXT works with the layers of the security stack to protect infrastructure, establish trust and verify adherence to security standards.

 

As mentioned, it works with the hypervisor layer, and also with the cloud orchestration layer, the security policy management layer, the Security Information and Event Management (SIEM) layer, and the Governance, Risk Management and Compliance (GRC) layer.

 

Conclusion

Cloud security has come a long way. Through the tools and technologies outlined above, it’s now possible to adequately secure both your data and your users. In doing so, you will establish security and trust in the cloud, and gain the agility, efficiency and cost savings that cloud computing brings.

 

- Arif

Read more >

Three Innovations Coming to Mobile in 2015


There are a number of business mobility innovations debuting from Intel and mobile device manufacturers in 2015. These will help enterprises improve the user experience and productivity of their mobile workforces, and help IT bridge the gap between new user demands and the mobile equipment on offer.

 

  • Form factor innovation

The forthcoming low-power, high-performance Intel Core M processor will enable form factor innovations such as razor-thin tablets that can offer the performance of a PC.

 

Ultra-thin 2-in-1s are also going to be attractive to businesses, which will be able to offer smaller and better form factors to their mobile workers and business travellers.

 

These form factors will bridge the gap between new user demands for business client devices, and what the business can offer. The new super-thin tablets and notebooks will offer improvements in user experience, collaboration, productivity and portability.

 

 


  • No wires technology

The second innovation coming to mobiles is ‘no wires’ technology, which will simplify life for everyone. It will be a boon for mobile workers and IT departments, and continue the trend of workplace transformation.

 

No wires docking, through Intel WiGig-based Wireless Docking technology, will transform meeting rooms. It gives workers ‘walk up’ convenience as it remembers their preferences, making them instantly productive.

 

No wires docking can be used in combination with wire-free keyboards, mice, printers and monitors – such as the Intel Pro Wireless Display – and is poised to improve office working and modernise both consumer and business computing.

 

Another exciting development coming in the future is ‘no wires charging’ for mobile devices. This will enable mobile workers to lay their device down on the desk so it can be charged automatically.

This technology will accompany existing enterprise-class technology such as Intel vPro, which provides security and manageability to wire-free environments. Intel vPro can make wire-free meeting rooms secure, offer remote management and features such as transferring screens from one room to another to enhance collaboration.

 

  • Stronger, simpler security

Thirdly, more advanced security – which is simpler both to use and administer – is coming to mobile devices. New security innovations will simplify the sign-on process, and reduce the number of passwords required.

 

Mobile devices will increasingly use multi-factor authentication, particularly biometric data such as face recognition, to improve business security. This ‘no passwords’ approach will simplify life for end users, who will be able to replace multiple passwords with a single biometric input.

 

As for IT departments, multi-factor authentication will harden the security of the platform, and will also simplify security management.

 

It will mean single sign-on for multiple cloud services through face recognition or two-factor authentication, and will utilise existing manageability and security features offered by Intel vPro technology.

 

Mobile computing is about to enter a new phase of sophistication.

 

- Arif

Read more >

Accelerating the Transformation to Software Defined Storage

We are all aware of the explosion of data driven by social media, streaming video, and other content-rich applications. Most of it needs to be stored somewhere, and many new applications and devices consume more resources and need to be deployed more rapidly than traditional back-office enterprise applications.

 

Six months is now too long to wait to deploy a new application or service. IT organisations now need to move in a matter of hours. This means the storage environment needs to be flexible, scalable and responsive. In other words, data centres need to be more cloud-ready or software-defined, presenting numerous challenges for the traditional storage environment in today’s data centre.

 

In order for data centres to transform into scalable, flexible environments, storage must evolve from purpose-built, dedicated silos designed for specific applications into larger general-purpose pools that are dynamically allocated and controlled via the software-defined storage control layer.

 

Intel says its vision of software-defined storage (SDS) is a framework that enables dynamic, policy-driven management of storage resources – a world where the application defines the storage.
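“The application defines the storage” can be pictured as a policy tag on the application driving which pool the SDS control layer allocates from. The pool names and policies below are invented for illustration; a real controller would map far richer policies (IOPS, replication, placement) onto real backends.

```python
# Toy policy-driven allocation: the application's policy tag, not an
# administrator, selects the storage pool.
POOLS = {
    "ssd": {"latency_ms": 1, "cost": "high"},
    "hdd": {"latency_ms": 10, "cost": "low"},
}
POLICIES = {"low-latency": "ssd", "bulk-archive": "hdd"}

def allocate(app: str, policy: str) -> dict:
    """Resolve an application's policy tag to a concrete storage pool."""
    pool = POLICIES[policy]
    return {"app": app, "pool": pool, **POOLS[pool]}

print(allocate("oltp-db", "low-latency"))
# {'app': 'oltp-db', 'pool': 'ssd', 'latency_ms': 1, 'cost': 'high'}
```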

“Intel is helping to accelerate this storage transformation with processor innovations, non-volatile memory, and networking and fabric products optimised for storage workloads,” says Andreas Schneider, Intel EMEA Storage Product Marketing Manager.

 

“We are actively engaged with partners in the ecosystem to deliver optimized SDS solutions that take advantage of these latest technologies and have reference architectures, or ‘recipes’, available as well.”

For example, the Intel® Storage Acceleration Library provides optimised algorithms that streamline the path through the processor, improving the performance of storage functions like deduplication, compression and erasure coding. These libraries are available to storage OEMs and ISVs, and are contributed to the open source community.
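Deduplication, one of the functions such libraries accelerate, works in miniature like this: split data into chunks, hash each chunk, and store each unique chunk only once. The fixed chunk size here is a simplification; real systems often use content-defined chunking and hardware-accelerated hashing.

```python
# Toy fixed-size deduplication: identical chunks are stored once,
# keyed by their hash.
import hashlib

def dedupe(data: bytes, chunk_size: int = 8) -> dict[str, bytes]:
    """Return a store of unique chunks keyed by SHA-256 hash."""
    store: dict[str, bytes] = {}
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        store.setdefault(hashlib.sha256(chunk).hexdigest(), chunk)
    return store

data = b"ABCDEFGH" * 4 + b"12345678"  # 5 chunks on disk, only 2 unique
print(len(dedupe(data)))              # 2
```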

 

Fujitsu recently announced a new hyperscale SDS appliance based on Intel technologies and Ceph open source software. The appliance is ideal for cost-sensitive users who need instant online access to large data volumes. Fujitsu and Intel collaborated to integrate Intel processors, SSDs, and software technologies such as Virtual Storage Manager (VSM). VSM is an open source software tool which simplifies Ceph cluster setup and management.

 

To continue moving forward with the SDS vision, we need the SDS controller, which has the visibility and control of all storage resources, as well as communication between applications, orchestrator and storage systems. This SDS controller needs to be based on open source standards to allow for interoperability across hardware platforms.

 

Schneider says, “We are working with the community to develop an open SDS controller prototype to help demonstrate the concept and validate the value proposition. As part of this effort, Intel is looking to partner with the OpenStack community to help solve key storage challenges.”

 

Where are you on the path to SDS? Does the requirement for the SDS controller resonate with you?

 

- Arif

Read more >

The How and Why of Wearables at Work

The way we live and engage with technology has been forever changed by the impact of wearables. While there have been some early adopters in certain industries — manufacturing, healthcare, and law enforcement, among others — wearable technology has primarily been consumer-focused to this point: smart glasses, shirts, cameras, and jewelry. But slowly and surely, wearable computers are making their way into the workplace.


Wearables at Work: “Odd, Yet Intriguing”

 

IT decision makers need to keep their ears open. Al Sacco recently wrote on CIO.com, “In fact, some experts think the true potential of wearable tech, the future of these odd yet intriguing gadgets, lies in enterprise or business use…Experts say smart CIOs and IT managers should be proactive in preparing for corporate wearables but also wary of embracing novel and untested devices.”

 

What Makes a Wearable Device Right for an Enterprise?

 

Blue Hill Research analyst James Haight recently interviewed Steve Holmes, vice president of smart device innovation at Intel, on wearables and the Internet of Things. Holmes highlighted three unique advantages to wearables, which might help CIOs avoid products that aren’t going to add long-term value:

 

  1. Persistence. They can be worn all the time (think watches, bracelets, glasses) allowing the user to measure things continuously.
  2. Intimacy. They can pick up information that remote devices cannot (such as heart rate).
  3. Immediacy. There is no need to open a computer or take out a phone to receive information.

 

Device-Enabled Freedom

 

Devices with these three assets enable apps to be part of a user’s daily life on demand without being disruptive. Enterprise-based wearables have the capacity to change the way employees interact with both their workspace and colleagues. There is potential for advancement in productivity and freedom by allowing employees to break away from their desks and mobile devices to engage in a whole new way, via voice commands, gestures, and natural movement. According to Karen Barrett at BizTech Magazine, “At the shallow end, wearables can enhance productivity by keeping employees connected. Going to a deeper level, hands-free devices that respond to voice commands and employ built-in cameras will transform the workplace for many professionals.”

 

Advancements in Workplace Wearables

 

Our Make It Wearable campaign invited designers from around the world to create prototypes for a chance to shape the future of wearables using Edison technology.

 

One of the 10 finalists, and the most enterprise-friendly concept, was Blocks, an open-source modular smart watch that lets users customize their experience. The blocks link to one another, and each is embedded with electronics that carry out a different function. This design allows the user to swap in new blocks as they are developed or upgraded, rather than buying a new smart watch whenever new functions are introduced.

 

While there are still problems to overcome with early wearable technology — security, cost, engagement, and privacy — innovators are challenging our perception of what wearable technology can be by creating more connectivity, integration, and interaction between the user and wearable. Through immediacy, intimacy, and persistence, wearables will offer the promise of time and money savings through real-time solutions.

 

To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

Read more >

Empowering Your Workforce Through Mobile Collaboration

With mobility slated to top enterprise CIO priorities in 2015, clear internal communication strategies and robust collaboration platforms are crucial for continued success. As enterprise companies continue to heavily invest in IT infrastructure that enables their employees to untether from their desks and work remotely, collaboration software like Microsoft* Lync and Skype enable workers to remain productive and responsive — even out of the office.


As companies have developed mobility strategies, many have invested in hardware such as tablets and smartphones in order to provide employees a modicum of flexibility. CIOs have discovered that due to their mobility strategies:

 

…employees in the field complete mission-critical tasks in real-time; they no longer gather information in one place, and then return to an office to transcribe what they learned on the road. Because workers are constantly connected, they’re always able to communicate, and are more likely to keep working even during off-hours.

 

Though mobility has proven an invaluable strategy across an increasingly complex enterprise landscape, its productivity gains are still evolving.

 

Refining Mobility in 2015


 

Mobility is no longer a “nice to have” for employees — it’s expected. However, as the enterprise continues to adopt mobile technology, flaws are starting to appear. As BYOD and mobility have evolved, cross-platform collaboration has been somewhat limited by compatibility issues between operating systems.

 

Collaboration platforms like Microsoft* Lync* and Skype* have apps for most popular mobile operating systems, but functionality across some of these mobile platforms may be significantly limited compared to the desktop client.

 

For example, Microsoft* recently announced support for unified Lync and Skype communication (meaning Lync users can call and message Skype users, and vice versa). However, this functionality is limited to the desktop clients of both services. This limitation, among others, has been a major stumbling block for companies trying to promote robust collaboration efforts alongside their mobility strategies.

 

Companies looking to refine their mobility strategy should consider the range of cross-platform compatibility issues and productivity limitations when looking to invest in any new device or strategy. A recent Principled Technologies test report compared Microsoft* Lync and Skype* performance on three popular mobile devices: an Apple* iPad* Air, Samsung* Galaxy* Note 10.1, and Microsoft* Surface* Pro 3.

 

The feature set and performance of the apps on the iPad* and Galaxy* Note were significantly limited compared to the Surface* Pro 3, which runs the full desktop version of Windows* 8.1 Pro rather than a mobile operating system.

 

Click here to read the full Principled Technologies test report.

 

*Other names and brands are property of others

Read more >

Identifying Your Mobile Device Management Strategy

Not all roads lead to BYOD.

 

Business paths diverge when it comes to mobile device management (MDM) strategy; as consumerization and mobility have become more prevalent within the enterprise, so has the variety in both corporate and customer requirements. According to Hyoun Park, “The proliferation of mobile devices has led to a similar proliferation of enterprise mobility support models. As your organization considers how to move forward to support mobile devices, applications, data, content, and unified communications, keep in mind how enterprise mobility is currently supported within your organization.”


Park states that the blanket term BYOD can be broken down into eight specific categories that better represent specific strategies and objectives for the business.

 

Eight Ways to Structure Your MDM

 

COLD: Corporate Owned, Locked Down

Provides both a secure device and secure gateway, with rigorous policies surrounding lost or stolen devices. “In today’s world, this model has only become even more secure with the encryption of voice calls, multifactor authentication, content and application virtualization to prevent improper sharing, and sandboxes used to isolate applications and content.”


COBRA: Corporate Owned, Business Ready Applications

New employees are given corporate devices pre-loaded with applications geared towards the mobile worker. “This might be as simple as including Dropbox, Box, or Evernote. This could also include mobile CRM and ERP applications, help desk applications, and productivity enablers.”


COPE: Corporate Owned, Personally Enabled

All devices are compartmentalized into corporate-owned and personally-owned. “This can be done by dedicating an enterprise-specific portion of the device to the applications and documents used in the workplace, while dedicating the rest of the device to Facebook, Angry Birds, personal e-mail accounts, and whatever else the employee wants to put on the device.”


CAPO: Corporate Approved, Personally Obtained

Devices are purchased by employees, but must meet corporate guidelines. “These standards can be as simple as supporting the company’s security or mobile device management standards or as complex as defining specific policies to shut off nearfield communications, camera, and other functions.”


EQUAL: EQuipment Under Approved List

EQUAL is a version of CAPO; all devices or operating systems are company ordained. “This allows companies to focus on the devices and operating systems they support without being overwhelmed by the evolution of mobility across every possible platform. However, the focus comes at the potential cost of creating a new version of shadow IT from unsupported devices.”


PEER: Personally Equipped, Enterprise Ready

PEER is a version of the COPE model; rather than the company funding the device, the employee makes the purchase instead. “The PEER model allows companies to put business applications, security, and governance onto a personally owned device. Employees agree to give businesses the control needed to transmit and support these applications.”


POOR: Personally Owned, Office Required

A somewhat controversial model that dictates employees must fund a device (sans employer compensation) in order to fulfill job requirements. “POOR is expected to become more troublesome as states increasingly see class action lawsuits that, like Cochran, are created based on a combination of state labor laws and BYOD requirements.”


CHAOS: Corporate Handles All Operating Systems

Often IT’s least favorite BYOD option, this means corporate supports all operating systems regardless of platform. “From an operational perspective, this approach often results in users falling through the cracks as IT is unable to provide employees with enterprise applications because vendors have never developed them for a specific platform. And from a support perspective, IT is constantly on the phone with additional support staff to troubleshoot unfamiliar devices.”
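The eight models above can be thought of as points along two axes: who owns and funds the device, and how tightly the company constrains it. As a rough illustration (not from any real MDM product — the attribute names and flag values are my own assumptions), they could be encoded as data so a policy tool can reason about them:

```python
from dataclasses import dataclass

# Illustrative encoding of Park's eight MDM models. The boolean flags are
# assumptions inferred from the descriptions above, not an official schema.
@dataclass(frozen=True)
class MdmModel:
    name: str
    corporate_owned: bool        # the company holds title to the device
    corporate_funded: bool       # the company pays for the device
    approved_list: bool          # restricted to an approved device list
    personal_use_allowed: bool   # personal apps/content permitted

MODELS = {
    "COLD":  MdmModel("Corporate Owned, Locked Down", True, True, True, False),
    "COBRA": MdmModel("Corporate Owned, Business Ready Applications", True, True, True, False),
    "COPE":  MdmModel("Corporate Owned, Personally Enabled", True, True, True, True),
    "CAPO":  MdmModel("Corporate Approved, Personally Obtained", False, False, False, True),
    "EQUAL": MdmModel("EQuipment Under Approved List", False, False, True, True),
    "PEER":  MdmModel("Personally Equipped, Enterprise Ready", False, False, False, True),
    "POOR":  MdmModel("Personally Owned, Office Required", False, False, False, True),
    "CHAOS": MdmModel("Corporate Handles All Operating Systems", False, False, False, True),
}

def employee_funded(acronym: str) -> bool:
    """True when the employee, not the company, pays for the device."""
    return not MODELS[acronym].corporate_funded
```

Laying the models out this way makes the trade-offs concrete: for instance, PEER and COPE grant the same personal-use freedom but differ only in who funds the device, which is exactly the distinction Park draws between them.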

 

As the device market continues to evolve, remember to refresh your strategy and policies often to keep pace with our ever-changing world.

 

To continue this conversation, please follow us at @IntelITCenter or use #ITCenter.

Read more >

The New Free Cyber Warfare Range is Open to the Public

I am excited about the opening of the free Cyber Warfare Range. I had the pleasure of meeting the team at Arizona Cyber Warfare Range (ACWR) and getting an exclusive tour of their virtual warfare range. During this tour of their internal architecture, I gained some insight into the controls that protect their cyber warfare environment. It is vital that the activities occurring inside the range do not get loose and directly impact the real world.




So what is a virtual warfare range?


A virtual warfare range is an open, virtual location where security professionals can test their skills and programs in a simulated environment. You can conduct dangerous activities in a safe, isolated, and controlled space. Think of it as a cybersecurity gun range or paintball arena. Users are encouraged to hack the servers, compromise networks, break software, test the robustness of products, and even play with toxic malware (in specialized ranges). Customized environments can be created to attack or defend, all of which are important learning experiences for security professionals.

 

It’s no secret that security professionals need practical, real-world experience. However, dangerous activities should never be conducted on production, personal, or work networks — a recipe for harmful unintended consequences. As a vitally important resource, the warfare range provides a free, internet-accessible, and safe place where novices and experts alike can learn and test their skills while conducting more specific activities, such as testing products and evaluating malware.

 

The ACWR is simply a safe environment for learning by doing. Hacking, testing, war games, malware practice, product evaluations, and real opponent challenges help security professionals hone their skills in an isolated setting. Beginner and advanced ranges provide teaching challenges, customizable environments, analysis, and metrics. The site encourages users to go wild, ‘burn systems to the ground’, and do whatever it takes to learn and improve.

 

No more excuses, time to get learning.

 

- Matthew Rosenquist

 

To find out more, visit the Arizona Cyber Warfare Range website: http://www.azcwr.com/


Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

My Blog: Information Security Strategy

Read more >

Intel Goes Platinum for OpenDaylight Project (ODL)

This blog is a summary of a conversation between Uri Elzur, Director of SDN architecture and OpenDaylight Board Member and Chris Buerger, Technologist within Intel’s Software-Defined Networking Division (SDND) marketing team. It outlines the motivation and plans driving Intel’s decision to increase its OpenDaylight Project membership to Platinum.

 

Chris: Intel has been a member of the OpenDaylight Project since its inception. We are now announcing a significant increase in our membership level to Platinum. Explain the reasoning behind the decision to raise Intel’s investment into ODL.

 

Uri: At Intel, we have been outlining our vision for Software Defined Infrastructure, or SDI. This vision takes a new approach to developing data center infrastructure, making it more agile and more automated to better meet the requirements that shape the data centers of tomorrow. Some of us fondly call the force shaping it ‘cloudification.’

 

SDI is uniquely meeting customer needs at both the top and the bottom line. Top line refers to greater agility and speed to develop data center scale applications, which in turn allows accelerated revenue generation across a larger number of our customers as well as the introduction of new, cloud-centric business models. At the same time, SDI also uniquely allows for the reduction of total cost of ownership for both service providers and their end-user customers. Service Providers are under intense competitive pressure to reduce cost, be it the cost of a unit of compute or, at a higher level, cost for a unit of application where an application includes compute, network, and storage.

 

Mapping this back to SDN and OpenDaylight, it is important to Intel to help our customers to quickly and efficiently benefit from this new infrastructure. To do that, we need to support both open and closed source efforts. OpenDaylight represents an open source community that has been very successful in attracting a set of industry contributors and that has also started to attract large end-user customers.

 

At this point in time, we see our efforts across multiple SDI layers that also include OpenStack and OpenVSwitch in addition to OpenDaylight come together in a coordinated way. This allows us to expose platform capabilities all the way to the top of the SDI stack. For example, by allowing applications to ‘talk back’ to the infrastructure to express their needs and intents, we are leveraging the capabilities of the SDN controller to optimally enable Network Function Virtualization workloads on standard high volume servers. This gives cloud service operators, telecommunication providers, and enterprise users superior support for these critical services, including SLA, latency and jitter control, and support for higher bandwidths like 40 and 100 Gigabit Ethernet. Among open source SDN controllers, OpenDaylight has shown healthy growth based on the successful application of open source principles such as meritocracy. We are excited about the opportunities to work with the OpenDaylight community as part of our wider SDI vision.

 

Chris: As Intel’s representative on the Board of the OpenDaylight Project, what do you envision as the key areas of technical engagement for Intel in 2015?

 

Uri: Keeping our customer needs and the wider SDI vision in mind, our first priority is to really exercise the pieces that the community has put together in OpenDaylight on standard high volume servers to deliver the benefits of SDN to end-users. We are also going to work with our community partners as well as end-user customers to identify, validate, and enhance workloads that are important to them – i.e., optimize the hardware and software on our platform to better support them. For example, take a look at the work being done in the recently announced OPNFV initiative. We are planning to take use cases from there and help the community optimize the low-level mechanisms that are needed in an SDN controller.

 

Chris:  The enablement of a vibrant ecosystem of contributors and end-users is critical to the success of open source projects. What role do you see Intel playing in further accelerating the proliferation of ODL?

 

Uri: We think Intel has a lot to bring to the table in terms of making the ODL community even more successful. Intel has relationships with customers in all of the market segments where an SDN controller will be used. We have also demonstrated our ability to create environments where the industry can test drive cutting-edge new technologies before they go to market. For SDI, for example, we created the Intel® Cloud Builders and Intel® Network Builders ecosystem initiatives to not only test the SDN controller, but also to couple it with a more complete and realistic software stack (SDI stack), a set of particular workloads, and Intel platform enhancements to establish performance, scalability, and interoperability best practices for complex data center systems. Bringing this experience to OpenDaylight accelerates the enablement of our SDI vision.

 

Chris:  Software Defined Networking and Network Function Virtualization capabilities are defined, enabled and commercialized on the basis of a multitude of standards and open source initiatives. How do you see Intel’s ODL engagement fitting within the wider efforts to contribute to SDN- and NFV-driven network transformation?

 

Uri: Our answer to this question has multiple parts. One change that we have seen over the last few months is a shift in organizations such as ETSI NFV that, while always considering SDN to be reasonably important, never placed much emphasis on the SDN controller. This has changed. The ETSI NFV community has come to terms with the idea that if you want scalability, a rich set of features, automation and service agility, then you need an SDN controller such as OpenDaylight as part of the solution stack. And we believe that ETSI represents a community that wants to use the combination of OpenDaylight, OpenStack and a scalable, high-performing virtual switch on low cost, high volume server platforms.

 

We have also observed some interesting dynamics between open source and standards developing organizations. What we are witnessing is that open source is becoming the lingua franca, a blueprint for how interested developers demonstrate their ideas to the rest of the industry as well as their customers. Open source promotes interoperability and collaboration between people working together to get to working code, which is then presented to the standards bodies. What excites us about OpenDaylight is that as a project it has also been very successful in working with both OpenStack and OpenVSwitch, incorporating standards such as OpenFlow and OVSDB. Moreover, interesting new work on service chaining and policies is happening in both OpenDaylight and OpenStack. And all of these initiatives align with network management modelling schemas coming out of the IETF and TOSCA.

 

All of these initiatives are creating a working software defined infrastructure that is automated and that helps to achieve the top- and bottom-line objectives we mentioned. OpenDaylight is a central component of Intel’s SDI vision, and we are excited about the possibilities that we can achieve together.

Read more >

Tablet PCs & Next-Gen Healthcare

Digital innovations in healthcare are streamlining daily tasks, enabling clinicians to provide faster, more accurate care, as well as empowering patients to take a bigger role in monitoring their own health. From big data to tablets to apps and smart watches, this technological shift is giving the healthcare industry an overhaul. With clinicians adopting digital record keeping, remote monitoring and care for patients, and other software as a service (SaaS) platforms, there is enormous potential to not only dramatically reduce administrative costs by up to $250 billion a year, but to also deliver a new level of sophistication and accuracy with regard to patient care.



According to a recent Forbes article, digitizing care is no longer something that healthcare providers can afford to ignore. Many industries already use technology and data to improve efficiency and quality, and healthcare providers who fail to use digital innovations to their advantage may find themselves losing patients to their competitors.

 

Mobile devices like tablets allow clinicians to optimize patient care through the use of advanced technology. A recent survey found that nearly 70% of clinicians in U.S. hospitals use tablets. According to the same study, 1 out of 3 healthcare providers report that using mobile devices increases their efficiency. These devices improve clinicians’ ability to communicate with patients and other healthcare providers, multitask, and access information such as test results that used to be tethered to desktop PCs and printouts stuffed in folders.


Pioneering the Healthcare of Tomorrow

 

With recent digital innovations in healthcare, doctors, nurses and other health professionals are looking to new mobile devices like tablets to enhance their capabilities and offer them versatility in and out of the exam room. However, with an excess of tablets and mobile devices to choose from, finding the right one can be difficult. Thankfully, with the help of a recent Principled Technologies report, choosing a tablet isn’t brain surgery.

 


The report compared the performance of the following popular tablets based on tasks healthcare professionals encounter each day: Microsoft Surface Pro 3*, HP ElitePad 1000 G2*, Dell Venue 8 Pro*, Apple iPad Air* and mini*. The Intel-powered Dell Venue 8 Pro*, HP ElitePad 1000 G2*, and Surface Pro 3* outperformed both the iPad Air* and iPad Mini* in a number of categories.

 

The Intel-powered devices in the study offer features like the ability to work in multiple apps simultaneously, create tasks with speech-to-text, load files from USB peripherals, and wirelessly print documents from the popular Allscripts Wand software.

 

For detailed comparisons of each device, check out the following case studies:

 

Microsoft Surface Pro 3* vs. Apple iPad Air*; HP ElitePad 1000 G2* vs. Apple iPad Air*; Dell Venue 8 Pro* vs. Apple iPad mini*.

 

*Other names and brands are property of others

Read more >

5 Most Interesting Security Metrics in the Q3 2014 McAfee Threat Report

The McAfee Labs Threat Report for Q3 2014 is out (McAfee is part of Intel Security). As one of my longstanding benchmarks for tracking malware growth and velocity, this report does not disappoint.

Here are my top 5 most interesting metrics that every security professional should be thinking about.

  1. Signed malware continues to skyrocket as an attacker practice, more than doubling to 40 million samples, a growth of over 1000% in two years!
    Signing malware with legitimate and trusted certificates is a great tactic for attackers to get their harmful files past network filters and security controls to be installed by unaware users.  We will see this trend continue, because it works.  In fact, I predict a more mature market will emerge for selling and using stolen credentials among hacking communities and darknet enterprises.  Be careful who you trust. 
    “Trust is the currency of security, without it we are bankrupt.”
  2. New Malware is created at a rate of over 5 per second, 307 per minute
    The relentless onslaught of malware production continues to grow at a tremendous pace.  Can attackers sustain this insane growth rate?  Yes.  Malware is easy to create, customize, and deploy.  More advanced and well-funded attackers have the ability to produce more complex malicious software to compromise systems and environments.  Take all necessary precautions and expect this trend to persist.  Rely on security products, services, architectures, vendors, and employees who can keep pace with the attackers.
  3. Total Malware in existence exceeds 300 million, growing 76% over the past year
    The malware zoo grows every year and now exceeds 300 million distinct samples.  It is mind boggling that we must be protected against each of these critters.  The electronic world is truly a hazardous place.  For organizations, the only way to survive the onslaught over time is a comprehensive, layered set of defenses: starting at the perimeter, supported within the network, reinforced with specialized communication protections (web, email, IM, etc.), embedded on client devices, and backed by the good judgment of users. 
  4. Mobile malware jumps 112% from last year
    Risks of malware on our mobile devices continue on a steady rise.  Not a sexy, news-grabbing story, but how long can we ignore these growing threats to our most-used computing devices? 
  5. Denial of Service still the king of network attacks
    Denial of Service attacks are still the most prevalent but aren’t necessarily the most impactful.  As attackers leverage other tools and methods to achieve their objectives, the mix will shift and DoS attacks will wane.  Will you and your organization be ready as attacks change to more effective ways to cause harm?  Security is an ongoing endeavor, and planning for the future is a requirement for sustaining a strong posture.  Past successes won’t stop attackers in the future.  As Sun Tzu said more than two thousand years ago, persistence is not important in combat, only victory.  Think ahead and prepare for how the threats will evolve.  It is your move.
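The headline numbers above are internally consistent, which is worth checking for yourself. A few lines of back-of-envelope arithmetic (the figures come from the metrics above; the derivations are my own) show how they relate:

```python
# Sanity-checking the report's headline figures.
new_per_minute = 307
per_second = new_per_minute / 60        # ~5.1, matching "over 5 per second"
per_day = new_per_minute * 60 * 24      # 442,080 new samples every day

total_now = 300_000_000                 # total malware zoo, Q3 2014
total_year_ago = total_now / 1.76       # 76% annual growth: ~170M a year earlier

signed_now = 40_000_000                 # signed-malware samples
signed_two_years_ago = signed_now / 11  # ">1000% growth" implies ~3.6M or fewer then

print(f"{per_second:.1f} new samples/second, {per_day:,} per day")
print(f"~{total_year_ago / 1e6:.0f}M total samples a year earlier")
print(f"~{signed_two_years_ago / 1e6:.1f}M signed samples two years earlier")
```

At roughly 442,000 new samples per day, signature-by-signature defense alone clearly cannot keep up, which is the underlying argument for the layered approach described in metric 3.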

 

Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

My Blog: Information Security Strategy

 

Read more >

Keeping Patient Data Safe from Evolving Threats

The healthcare industry’s digital transformation calls for shifting the burden of care from the system to the patient. Technology is helping to lead this charge, as evidenced by the growing number of patients who are now able to track their own health information as well as generate data that previously was unavailable to physicians and other care providers. With the 2nd Annual Healthcare Cyber Security Summit this month – and the attack vectors targeting the industry having changed over the past couple years – it’s a good time to revisit the topic.

 

Mobile devices, EMRs, HIEs, cloud computing, telemedicine, and other technologies are now common in healthcare settings, incrementally delivering on their promise to stretch resources and lower costs. But along with these new capabilities come new threats to patient data and the organizations responsible for managing it. Such threats are reflected in the rise of HIPAA data breaches from 2012 to 2013, as well as in the increase of state- and corporate-sponsored cyber attacks targeting medical device makers in 2014. As a recent webinar presented by NaviSite pointed out, the emerging Internet of Things (IoT) also raises the stakes for healthcare organizations, as reflected by Europol’s recent warning about IoT and the FDA’s determination that some 300 medical devices are vulnerable to attack.

 

In April, the FBI issued a sobering notification to healthcare organizations stating that the industry is “…not technically prepared to combat against cyber criminals, basic cyber intrusion tactics, techniques and procedures…” Nor is it ready for some of the more advanced persistent threats facing the industry.

 

It doesn’t help that medical records are considered up to 50 times more valuable on the black market than credit card records.

 

Whether through HIPAA data breaches, malware, phishing emails, sponsored cyber-attacks, or threats surrounding the evolving Internet of Things, the emerging threats in healthcare cannot go unaddressed. Security experts say cyber criminals increasingly are targeting the industry because many healthcare organizations still rely on outdated computer systems lacking the latest security features.

 

With so many mobile and internet-connected devices located in healthcare settings, determining how to secure them should be a top priority. That means developing and implementing strategies that make anti-virus, encryption, file integrity and data management a top priority.

 

Security experts report that, ultimately, data correlation is the key. What is important for healthcare organizations is having a system in place that empowers threat identification, classification, system analysis, and a manual review process that offsets human error, enabling 100 percent certainty regarding potential incidents.
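To make the correlation idea concrete, here is a deliberately minimal sketch: flag hosts whose event streams show repeated authentication failures, so a human reviewer can examine them. The event schema, host names, and threshold are all hypothetical illustrations, not a real SIEM interface:

```python
from collections import Counter

# Hypothetical sample event stream; in practice this would come from
# aggregated device and application logs.
events = [
    {"host": "ehr-01", "type": "auth_failure"},
    {"host": "ehr-01", "type": "auth_failure"},
    {"host": "ehr-01", "type": "auth_failure"},
    {"host": "mri-02", "type": "auth_failure"},
    {"host": "ehr-01", "type": "file_change"},
]

def correlate(events, threshold=3):
    """Return hosts with at least `threshold` auth failures, for manual review."""
    failures = Counter(e["host"] for e in events if e["type"] == "auth_failure")
    return sorted(host for host, count in failures.items() if count >= threshold)

print(correlate(events))  # → ['ehr-01']
```

The point is the workflow, not the ten lines of code: correlation narrows thousands of raw events down to a short list of candidates, and the manual review step then offsets both false positives and human error.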

 

With this in mind, how is your organization safeguarding against cyber threats? Do you rely on an in-house cybersecurity team, or has your organization partnered with a managed security service provider for this type of service?

Read more >