Recent Blog Posts

Is Cloud Destined to be Purely Public?

 

51 percent of workloads are now in the cloud. Is it time to break through that ceiling?

 

 

At this point, we’re somewhat beyond discussions of the importance of cloud. It’s been around for some time, just about every person and company uses it in some form, and, for the kicker, 2014 saw companies place more computing workloads in the cloud (51 percent), through either public cloud or colocation, than they processed in house.

In just a few years we’ve moved from every server sitting in the same building as those accessing it, to a choice between private or public cloud, and now to the beginning of the IT model du jour: hybrid cloud. Hybrid is fast becoming the model of choice, fusing the safety of an organisation’s private data centre with the flexibility of public cloud. However, in today’s fast-paced IT world, as one approach becomes mainstream the natural reaction is to ask, ‘what’s next?’ A plausible next step in this evolution is the end of the permanent, owned datacentre, and even of long-term co-location, in favour of an infrastructure built entirely on the public cloud and SaaS applications. The question is: will businesses really go this far in their march into the cloud? And do we want it to go this far?

 

Public cloud, of course, is nothing new to the enterprise, and it’s not unheard of for a small business or start-up to operate solely from the public cloud and SaaS services. There are, however, few, if any, examples of large-scale corporates eschewing their own private datacentres and co-location arrangements in favour of this pure public cloud approach.

 

For such an approach to become plausible in large organisations, CIOs need to be confident about putting even the most sensitive of data into public clouds. This entails a series of mentality changes that are already taking place among SMBs. The cloud-based Office 365, for instance, is Microsoft’s fastest-selling product ever. For large organisations, however, this is far from a trivial change, and CIOs are far from ready for it.

 

The Data Argument

 

Data protectionism is the case in point. Data has long been a highly protected resource for financial services and legal organisations both for their own competitive advantage and due to legal requirements designed to protect their clients’ information. Thanks to the arrival of big data analysis, we can also add marketers, retailers and even sports brands to that list, as all have found unique advantages in the ability to mine insights from huge amounts of data.

 

This is at once an opportunity and a problem. More data means more accurate and actionable insights, but that data needs storing and processing and, consequently, an ever-growing amount of server power and storage space. Today’s approach to this issue is the hybrid cloud: keep sensitive data primarily stored in a private data centre or co-located, and use the public cloud as an overspill for processing, or as object storage, when requirements exceed the organisation’s existing capacity.

 

The amount of data created and recorded each day is ever growing. In a world where data growth is exponential, the hybrid model will be put under pressure. Even organisations that keep only the most sensitive and mission-critical data within their private data centres whilst moving all else to the cloud will quickly see data inflation. Consequently, they will be forced to buy ever greater numbers of servers and ever more space to house their critical data, at ever-growing cost and without the flexibility of the public cloud.

 

In this light, a pure public cloud infrastructure starts to seem like a good idea: an infrastructure that can be instantly switched on and expanded as needed, at low cost. The idea of placing their most sensitive data in a public cloud, beyond their own direct control and security, will, however, remain unpalatable to the majority of CIOs. That is understandable when you consider research such as that released last year stating that only one in 100 cloud providers meets the EU Data Protection requirements currently being examined in Brussels.

 

So, increasing dependence on the public cloud becomes a tug of war between a CIO’s data burden and their capacity for the perceived security risk of the cloud.

 

Cloud Creep

 

The process that may well tip the balance in this tug of war is cloud’s very own version of exposure therapy. CIOs are storing and processing more and more non-critical data in the public cloud and, across their organisations, business units are independently buying in SaaS applications, giving them a taste of the ease of the cloud (from an end-user point of view, at least). As this exposure grows, the public cloud and SaaS applications will increasingly prove their reliability and security whilst earning their place as invaluable tools in a business unit’s armoury. The result is a virtuous circle of growing trust in public cloud and SaaS services: greater trust means more data placed in the public cloud, which in turn creates greater trust. Coupled with the ever-falling cost of public cloud, eventually, surely, the perceived risks of the public cloud will fall enough to make its advantages outweigh the disadvantages, even for the most sensitive of data?

 

Should It Be Done?

This all depends on a big ‘if’. Trust in the public cloud and SaaS applications will only grow if public cloud providers remain unhacked and SaaS data unleaked. This is a big ask in a world of weekly data breaches, but security is relative, and private data centre leaks are rapidly becoming more common, or at least better publicised, than those in the public cloud. Sony Pictures’ issues arose from a malevolent force within its own network, not from its public cloud-based data. It will take many more attacks such as these to convince CIOs that losing direct control of their data security and putting all that trust in their cloud provider is the most sensible option. Those attacks seem likely to come, however, and in the meantime, barring a major outage or truly headline-making attack, cloud exposure is increasing confidence in the public cloud.

At the same time, public cloud providers need to work to build confidence, not just passively wait for the scales to tip. Selecting a cloud service is a business decision, and any CIO will apply the same diligence to it that they would to any other supplier choice. Providers that fail to meet the latest regulation, aren’t visibly planning for the future, or fail to convince on data privacy concerns and legislation will damage confidence in the public cloud and actively hold it back, particularly within large enterprises. Those providers that do build their way to becoming a trusted partner will, however, flourish and compound the ever-growing positive effects of public cloud exposure.

 

As that happens, the prospect of a pure public cloud enterprise becomes more realistic. Every CIO and organisation is different and will have a different tolerance for risk. This virtuous circle of cloud will tip organisations towards pure cloud approaches at different times, and every cloud hack or outage will set the model back by a different amount in each organisation. It is, however, clear that, whether desirable right now or not, pure public cloud is rapidly approaching reality for some larger enterprises.

Read more >

Cold Gas Reaction Control System with Intel Edison

Mechanical Engineering students from Maseeh College of Engineering and Computer Science at Portland State have posted an impressive Intel Edison-based solution to greatly reduce the cost of Cold Gas Reaction Control Systems for rockets using affordable components. According to notes … Read more >

The post Cold Gas Reaction Control System with Intel Edison appeared first on Intel Software and Services.

Read more >

It Looked Good On Paper

When it comes to technology and automation for data centers, IT is not a hard sell. But sometimes it pays to step back and take a second look. For example: we worked with an enterprise that was well underway in its build-out of a new … Read more >

Smart Buildings: Intel IoT Platforms Open Doors to Innovation at IBCon

Intel Internet of Things (IoT) solutions provided the building blocks for connected, smart buildings at the Intelligent Building Conference (IBCon) in San Antonio, Texas, last week. IBCon featured building management solutions based on open platforms like the Intel IoT Gateway, … Read more >

The post Smart Buildings: Intel IoT Platforms Open Doors to Innovation at IBCon appeared first on IoT@Intel.

Read more >

3 Strategies to Get Started with Mobile BI

In my post, “Mobile BI” Doesn’t Mean “Mobile-Enabled Reports,” I highlighted two main areas that affect how organizations can go about realizing the benefits of mobile BI: enterprise mobility and BI maturity.

 

Today I want to focus on the latter and outline high-level strategies that require different avenues of focus, time, and resources.

 

Before an organization can execute these high-level strategies, it must have the following:

 

  • An existing BI framework that can be leveraged
  • Current technology (hardware and software) used for BI that supports mobile capabilities
  • A support infrastructure to address technical challenges

 

If an organization meets these minimum prerequisites, then there’s a greater chance for success. Thus, the higher the level of BI maturity, the better a head start an organization gets on its mobile BI journey.

 

“Mobile-Only” Strategy

 

A “mobile-only” strategy reflects a strong commitment, or all-in approach, by the management team to mobile BI, or to mobility in general. This may be due to a specific reason, such as the relevance of mobility in a particular industry or the opportunity to create a strategic advantage in a highly competitive market. Or a company may decide that mobility needs to be a vital part of its vision.

 

However, in order for this strategy to be successful, it requires a commitment that results in both championing the cause at the board or senior management level and making the necessary resources available for execution at the tactical level.

 

In reality, this approach doesn’t necessarily translate into creating a mobile version of every analysis or shutting down all PC-based channels for reporting and analytics. Instead, it reflects a strong emphasis on establishing scalable mobile consumption paths for analytics, and it signals a willingness to exploit a mobile-first mindset.

 

Figure: Mobile BI strategy.

 

“Key Asset(s) First” Mobile Strategy

 

Organizations that aren’t ready or don’t have the resources for a mobile-only strategy may be forced to pursue a less ambitious approach. This would enable such organizations to supplement their existing BI portfolio with key analyses delivered in mobile BI, resulting in a smaller initial investment and reduced pressure to overhaul large stacks of assets, so to speak.

 

With the “key-assets-first” strategy, the spotlight is on finding key BI areas of focus that can both return the maximum value when delivered effectively on mobile platforms and directly support the execution of the business strategy in the short term. For example, the business strategy may include expansion into a new market, and mobile BI may deliver analytics that help sales teams sell more and give management insight into forecast and pipeline.

 

To me, this is the most flexible strategy because it doesn’t commit to an all-or-nothing approach. Most importantly, it differentiates between what is conducive to mobile-ready consumption and what can produce the maximum impact, ignoring those assets with marginal returns on investment.

 

“Key Group(s) First” Mobile Strategy

 

A “key-group-first” strategy makes a considerable commitment to arm a particular group or groups in an organization with a complete set of capabilities that can be delivered in mobile BI. This hybrid strategy identifies the best candidate group(s) for mobile BI and delivers an end-to-end solution. At minimum, it may consider the existing BI framework, the BI culture (history in terms of successes and failures), the BI adoption across the enterprise, and the current BI asset portfolio to develop this more comprehensive approach.

 

For example, sales teams, which travel and spend a lot of time in the field, tend to benefit most from mobility. If they’re selected as the target group, the mobile BI strategy’s goal will be to provide them with a comprehensive package of sales-centric BI assets. Thus, existing capabilities in enterprise mobility for sales teams may complement not only the delivery of new mobile BI resources but also the greater adoption of mobile BI content.

 

Mobile BI Bottom Line

 

The fundamentals don’t change — a smart mobile BI strategy needs to contribute to growth or to profitability. In order to deliver the true business value for mobile BI, all three strategies must embrace the common objectives of an integrated mobile intelligence framework. They must leverage the technology’s strengths as well as minimize its weaknesses within a supported infrastructure.

 

The mobile intelligence framework can’t exist separately from, or independent of, the organization’s business or technology strategy.

 

What is your start-up strategy for mobile BI?

 

Stay tuned for my next blog in the Mobile BI Strategy series.

 

Connect with me on Twitter at @KaanTurnali and LinkedIn.

 

This story originally appeared on the SAP Analytics Blog.

Read more >

Part II: Future Health, Future Cities – Intel Physical Computing Module at IDE

by Chiara Garattini & Han Pham

 

In the first of our Future Health, Future Cities blog series, we posed questions around the future of health in the urban environment. Today we look at some of the projects undertaken by students on the physical computing module on the Innovation Design and Engineering Masters programme run in conjunction between Imperial College and the Royal College of Arts (RCA).

 

Health and Safety in the Workplace

The first group of projects relates to the important issue of health and safety in the workplace.


Figure 1. Circadian Glasses


Christina Petersen’s ‘circadian glasses’ considered the dangers of habitual strains and stressors at work, particularly for individuals in careers with prolonged evening hours or excessive time in light-poor conditions, which may have a cumulative effect on health over time. Although modern technologies allow for the convenience of working at will regardless of external environmental factors, what is the effect on the body’s natural systems? In particular, how does artificial lighting affect the circadian rhythm?

 

Her prototyped glasses use two LED screens that can adjust the type of light to help users better regulate their circadian rhythms and sleep patterns. The concept also suggests a potentially valuable intersection of personal wearables and personal energy usage (lighting) in the future workplace. Unlike sunglasses, the glasses are also a personal, portable source of light – an interesting concept in workplace sustainability, given that the majority of energy expenditure is in heating/cooling systems and lighting.
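As a rough illustration of the kind of control logic such glasses might use, here is a minimal sketch in Python. The shift hours and colour-temperature targets are our own illustrative assumptions, not the student’s actual implementation.

# Minimal sketch: choose an LED colour-temperature target from the hour of day,
# so a night-shift wearer gets cooler, daylight-like light while working and
# warmer light approaching their sleep window. All values are illustrative.
from datetime import datetime

def target_colour_temp(hour: int, shift_start: int = 22, shift_end: int = 6) -> int:
    """Return a colour temperature in kelvin for the given hour (0-23)."""
    on_shift = hour >= shift_start or hour < shift_end
    if on_shift:
        return 6500   # cool light to promote alertness during the shift
    if shift_end <= hour < shift_end + 2:
        return 2700   # warm light to wind down before sleep
    return 4000       # neutral light otherwise

if __name__ == "__main__":
    hour_now = datetime.now().hour
    print(f"Hour {hour_now}: drive LEDs toward ~{target_colour_temp(hour_now)} K")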

 

While there is room to make the user context and motivation more plausible, the prototype literally helps shed light on meaningful, and specific, design interventions for vulnerable populations such as nurses or night shift workers for personal and workplace sustainability over time.

 


Figure 2. Smart Workplace Urinal


As we often see in our work, a city’s hubs for healthcare resources and information are informally ubiquitous, present within the community long before one reaches the hospital. Jon Rasche’s smart urinal was created to decrease the queue and waiting time at the doctor’s office even before you arrive, by enabling more personal, preventative care via lab testing at the workplace.

 

The ‘Smart Urinal’ created an integrated service with a urinal-based sensor and a display unit, QR codes, and a mobile application (Figure 2). The system also considered concerns around patient privacy by intentionally preventing private patient information from entering the cloud. Instead, each of the possible results links to a QR code leading to a static web page with the urinalysis information.
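To make the privacy mechanism concrete, here is a minimal sketch in Python of the static-result-page idea: each discrete outcome maps to a pre-built QR code pointing at a fixed page, so no patient data needs to enter the cloud. The URLs, result names and the use of the qrcode package are our own illustrative assumptions, not the student’s code.

# Minimal sketch: map each canned urinalysis outcome to a pre-generated QR code
# that points at a static information page. Requires the "qrcode" package
# (with Pillow); URLs and result names below are placeholders.
import qrcode

RESULT_PAGES = {
    "normal":       "https://example.org/urinalysis/normal",
    "high_glucose": "https://example.org/urinalysis/high-glucose",
    "high_protein": "https://example.org/urinalysis/high-protein",
}

def qr_for_result(result: str) -> str:
    """Render the QR code for one canned result and return the image path."""
    url = RESULT_PAGES[result]
    path = f"qr_{result}.png"
    qrcode.make(url).save(path)   # qrcode.make() returns a PIL-backed image
    return path

if __name__ == "__main__":
    for result in RESULT_PAGES:
        print(result, "->", qr_for_result(result))

Because every QR code is generated ahead of time from a fixed set of outcomes, the display unit only ever shows one of a handful of static images; nothing about the individual test needs to leave the device.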

 

While the system might be perceived as too public for comfort, it connects to the technological trend toward more personalised and accessible testing (Scanadu’s iPhone-ready urinalysis strip is a good example). It also raises the consideration of how to design for the connected ecosystem of responsibility, accountability and care: how can different environments influence, impact and support an individual’s wellbeing? How can personalised, connected care be anticipatory, preventative and immediate, yet private?

 

Pollutants Awareness

The dynamic life of a city often means it’s in a state of constant use and regeneration – but many of the resulting pollutants are invisible to the naked eye. How do we know when the microscopic accumulation of pollutants will be physically harmful? How can we make the invisible visible in a way that better engages us with our environment?

 


Figure 3. Air Pollution Disk


Maria Noh’s ‘Air Pollution Disc’ (Figure 3) considers how we can make information more physical, visible and intuitive by creating a mechanical filter on our immediate environment, driven by local air quality data and using polarised lenses.
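As a sketch of the data-to-mechanism mapping, the disc’s behaviour can be thought of as a single function from an air quality reading to a lens rotation. The AQI-style 0-500 scale and the linear mapping below are our own illustrative assumptions, not details of Noh’s prototype.

# Minimal sketch: map an AQI-style pollution score to the rotation angle of the
# moving polarised lens, so the disc darkens as local air quality worsens.
def lens_angle(aqi: float, max_aqi: float = 500.0) -> float:
    """Return the cross-polarisation angle in degrees (0 = clear, 90 = dark)."""
    aqi = max(0.0, min(aqi, max_aqi))     # clamp to the assumed scale
    return 90.0 * aqi / max_aqi

if __name__ == "__main__":
    for reading in (20, 150, 420):
        print(f"AQI {reading} -> rotate lens to {lens_angle(reading):.0f} degrees")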

 

It’s a very simple mechanism with an elegant design that ties to some of our earlier cities research into perceptual bias around air quality, substituting physical feedback for numeric data (e.g., although pollutants may not always be visible, we equate pollution with visual cues). Noh suggested two use scenarios: one was to affix the device to a window of a home to understand pollution at potential destinations, such as a school; another was to influence driver behaviour by providing feedback on the relationship of driving style to pollution.

 

While there are nuances and challenges to either case, the immediacy of the visualisation for both adults and children may make it interesting to see the Air Pollution Disc as a play-based, large-scale urban installation that physicalises the hidden environment of the city.

 


Figure 4. Ghost 7.0


The pollutants category also includes the prototype for ‘Ghost 7.0’ by student Andre McQueen, a smart clothing system that addresses how weather and air quality affect health. The idea tries to tackle breathing problems, e.g. due to allergies, associated with weather changes. The device (Figure 4), embedded in running clothing, is designed to communicate with satellites to receive updates on weather conditions and signal warnings under certain circumstances.

 

When a significant meteorological change is signalled, the fabric would change colour and release negative ions (meant to help breathing under certain conditions). The student also investigated oxidisation to fight pollutants, but could not overcome the problem of it releasing small amounts of CO2.

 

What we found interesting in this project was the idea that a wearable device would do something to help against poor air quality, rather than just passively detecting the problem. Too many devices currently are focusing on the latter task, leaving the user wondering about the actionability of the information they receive.

 


Figure 5. Dumpster diving ‘smart glove’


The last selected project for this section, by student Yuri Klebanov, is on dumpster diving. Yuri built a system to make dumpster diving safer (by creating a ‘smart glove’ that reacts to chemicals) and more effective (by creating a real-time monitoring system that uploads snapshots of what is thrown away to a website for users to monitor).

 

While the latter idea is interesting, it presents several challenges (e.g. privacy around taking pictures of people throwing things away); what we liked most about the project was the ‘smart glove’ idea. The solution was to boil fabric gloves with cabbage, making them capable of changing colour when in contact with acids, liquids, fats and so on (Figure 5). This frugal technology solution made us reflect on how smart ‘smart’ really needs to be. Technology overkill is not always the best solution to a problem, and something simple is always preferable to something more complex that provides the same (or only marginally better) results.

 

In the third and final blog of our Future Cities, Future Health blog series, we will look at the final theme, Mapping Cities (Creatively), which will showcase creative ideas for allocating healthcare resources and for using sound to produce insights into complex health data.

 

Read Part I

Read Part III

 

 

*Concepts described are for investigational research only.

**Other names and brands may be claimed as the property of others.

Read more >

Part I: Future Health, Future Cities – Intel Physical Computing Module at IDE

Intel has sponsored a physical computing module on the topics of ‘Future Health, Future Cities’ as part of the first year in the Innovation Design and Engineering Master programme in conjunction between Imperial College and the Royal College of Arts (RCA). This module, coordinated by Dominic Southgate at Imperial College, was intended to be an investigation into the future of health in urban environments and a projection of how technology might support this multi-faceted theme.

 

The Intel team (John Somoza, Chiara Garattini and Duncan Wilson) suggested themes that the 40 students (10 allocated for each theme) had to work on individually over the course of four weeks:

 

1.  Food, Water, Air

The human body can only live for three weeks without food, three days without water, and three minutes without air. These ingredients are vital for our survival and key to our good health – how can we optimise each of them within our cities?

 

Food has an obvious connection to healthy living. But what about the more subtle relationships? How can food be analysed/customised/regulated to help with specific disorders or conditions? Meanwhile, how can technology help us with water catchment and distribution in the city, or help manage water quality? Can we reuse water better?

 

Likewise, an invisible and yet vast component of a city is its air, which is key to human well-being. While air is currently rated by proxy metrics in many ways, what do air quality and pollution look like through a human lens? How can we re-think the air we breathe?

2.  Systems of Systems

A city is made of many systems, inextricably related and dependent on each other. One important system is healthcare. How can we re-imagine a city-based healthcare service? For example, hospitals are currently hubs for providing health care when needed, yet they often may not be the first or best place we seek care when unwell. Can we reimagine what a hospital of the future would look like? What would a healthcare worker of the future look like, and what equipment would they use?

 

Although we currently use tools such as healthy city indices that rate cities as healthy or unhealthy, how could we measure a healthy city in a way that reflects its complexity? Measuring the world in a new way at some point becomes experiencing the world in a new way: what tools do we need, and what are the implications?

 

Ultimately, if cities are systems of systems, then we are the nodes in those systems: how do we understand the impact of our individual, cumulative actions on the larger systems? How can we see small, seemingly un-impactful actions in their incremental, community-wide scaling? How can we entangle (or disentangle) personal and collective responsibilities?

 

3.  Measuring and Mapping

There are various ways to measure a sustainable city, but none is perfect (e.g. carbon credits). What is the next thing for measuring a sustainable city? What would be the tools to do so? How local do we want our measures to be?

 

Our cities have different levels of language and communication embedded in their fabric (symbols, maps, and meanings). Some of these are more evident and readable than others, marking danger, places, and opportunities. One class of such signals relates to health. What kind of message does our city communicate in order to tell us about health? What symbols does it use and how do they originate and change through time?

 

4.  Cities of Data

Much of the current quantified-self movement is centred on metrics collected by individuals and shared with a relatively close, like-minded community. What would a ‘quantified-selfless’ citizen look like within the context of a city-wide community? How would people share data to improve their lives and that of other people? How could this impact on the environment and systems in which they live? How could the city augment and integrate people’s self-generated data and support you in an effort of being ‘healthy’ (e.g. personalised health cityscapes)? At the same time, individual and communities’ interests are sometimes harmonic and sometimes competing. How would citizens and cities of the future face this tension?

 

Commentary on selected projects

The underlying idea behind these themes was to stimulate design, engineering and innovation students to think about the complex relationship between connected cities and connected health. Because the task is wide and complex, we decided to start by pushing them to consider some broad issues, e.g.: how can a city’s health infrastructure become more dynamic? How can we help cities reconsider the balance between formal and informal resourcing to meet demand? What are the triggers that help communities understand and engage with environmental and health data?

 

The aim was to encourage the upcoming generation of technology innovators to think of health and cities as vital to their work.

 

The IDE programme was ideal for the task. Imagined originally for engineers who wanted to become more familiar with design, it has now transformed into a multidisciplinary programme that attracts innovative students from disciplines as varied as design, business, fashion and archaeology. This shows a resurgence of the relevance of engineering among students, possibly stimulated by the accessibility and ubiquity of tools for development (e.g. mobile apps) as well as the desire to find solutions to pressing contemporary problems (e.g. aging population trends).

 

Students were able to explore different points of interest in Intel’s ‘Future Health, Future Cities’ physical computing module, each an interesting starting point into the challenges of designing for complex, living systems such as a city.

 

We will share eight of the projects in our next two blogs, selected not on the basis of their overall quality (which was assessed by their module coordinators) but rather on how their collective narrative, under three emergent sub-themes, helps highlight connections to some of the ongoing challenges and questions we face in our daily work.

 

Read Part II

Read Part III

 

 

*Concepts described are for investigational research only.

**Other names and brands may be claimed as the property of others.

Read more >

Part III: Future Health, Future Cities – Intel Physical Computing Module at IDE

by Chiara Garattini & Han Pham

 

Read Part I

Read Part II

 

In the third and final edition of our Future Cities, Future Health blog series, we will look at the final theme, Mapping Cities (Creatively), which showcases creative ideas for allocating healthcare resources and for using sound to produce insights into complex health data, all part of the physical computing module on the Innovation Design and Engineering Masters programme run jointly by Imperial College and the Royal College of Arts (RCA).


Mapping cities (creatively)

In considering how to allocate resources, we also need to understand where resources are most needed, and how this changes dynamically within a city.

 


Figure 1. Ambulance Density Tracker


Antoni Pakowski asked how to distribute ambulances within a city to shorten response times for critical cases, and suggested this could be supported by anonymous tracking of people via their mobile phones. The expected service window of ambulance arrival in critical care cases is 8 minutes. However, in London, only around 40 percent of calls meet that target. This may be due to ambulances being tied to a static base station. How can the location of the ambulance change as people density changes across a city?

 

The ambulance density tracker (Figure 1) combined a mobile router and a hacked Pirate Box to anonymously retrieve the IPs of phones actively seeking Wi-Fi, creating a portable system to track the density of transient crowds. The prototype was designed to rely upon only one point of data within a given region, requiring less processing than an embedded phone app. He also created a scaled-down model of the prototype to suggest a future small device that could be affixed to static and moving infrastructure, such as taxis, within the city.
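For readers curious how such a device might count phones without storing identities, here is a minimal sketch in Python using the scapy library to count Wi-Fi probe requests. It assumes a wireless interface already in monitor mode (here called "mon0") and hashes the transmitter address before storing it; it is our own illustration, not the student’s Pirate Box code.

# Minimal sketch: estimate transient crowd density by counting distinct devices
# sending Wi-Fi probe requests. Assumes a monitor-mode interface "mon0" and the
# scapy library; device addresses are hashed so no raw identifier is kept.
import hashlib
import time
from scapy.all import sniff, Dot11ProbeReq

seen = {}  # hashed device ID -> last time seen

def handle(pkt):
    if pkt.haslayer(Dot11ProbeReq) and pkt.addr2:
        device = hashlib.sha256(pkt.addr2.encode()).hexdigest()[:16]
        seen[device] = time.time()

def density(window_s=60):
    """Number of distinct devices heard in the last window_s seconds."""
    cutoff = time.time() - window_s
    return sum(1 for t in seen.values() if t >= cutoff)

if __name__ == "__main__":
    sniff(iface="mon0", prn=handle, timeout=60)   # requires root privileges
    print(f"Devices seen in the last minute: {density()}")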

 

Although the original use case needs additional design work to be clearer, the prototype itself as a lightweight, anonymous device that allows for a portable proxy of transient crowd density may be useful as a complementary technology for other design projects geared toward designing for impromptu and ad hoc health resources within a city based on audience shifts.

 


Figure 2. ‘Citybeat’


The second project in this category is ‘Citybeat’ by student Philippe Hohlfeld (Figure 2). Philippe wanted to look at the sound of a city and not only create ‘sound’ maps of the city, but also capture the ‘heartbeat’ of a city by exploring ‘sonified’ feedback from it. His thinking originated from three distinct scientific endeavours: a) turning preliminary Higgs boson data from the ATLAS experiment at CERN into a symphony to celebrate the connectedness of different scientific fields; b) turning solar flares into music at the University of Michigan to produce new scientific insights; and c) a blind scientist at NASA turning the gravitational fields of distant stars into sound to determine how they interact.

 

The project looked specifically at the Quality of Life Index (safety, security, general health, culture, transportation, etc.) and tried to attribute sounds to its different elements so as to create a ‘tune’ for each city. Sonification is good for finding trends and for comparison between two entities. What we most liked about the project, though, was the idea of using sound rather than visual tools to produce insights into complex data.


Personal data from wearables, for example, is generally presented in visual dashboards. Even though these are meant to simplify data consumption, they do not always do so. Sound could be quicker than visual displays in expressing, for example, rapid or slow progress (e.g. upbeat) or regress (e.g. downbeat). In the current landscape of information overload, exploring sound as an alternative way of summarising usage is something we thought very interesting.
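As a small illustration of what a ‘city tune’ could mean in practice, here is a minimal sketch in Python that maps a handful of made-up Quality of Life scores to tones of rising pitch and writes them out as a short WAV file. The scores, scale and frequency range are illustrative assumptions, not Hohlfeld’s actual mapping.

# Minimal sketch: sonify a set of (made-up) Quality of Life scores by mapping
# each 0-100 score to a 220-880 Hz tone and concatenating the tones into a WAV.
import math
import struct
import wave

SAMPLE_RATE = 44100

def tone(freq_hz, seconds=0.5, volume=0.4):
    """Return raw 16-bit mono samples for a sine tone."""
    n = int(SAMPLE_RATE * seconds)
    return b"".join(
        struct.pack("<h", int(volume * 32767 *
                              math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)))
        for i in range(n)
    )

def sonify(scores, path="citybeat.wav"):
    """Write one tone per metric, pitch rising with the score."""
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        for score in scores.values():
            wav.writeframes(tone(220 + (score / 100) * 660))
    return path

if __name__ == "__main__":
    print(sonify({"safety": 72, "health": 64, "culture": 81, "transport": 55}))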


Figure 3. ‘Bee gate’

Finally, the last selected project in this list is also one of the most unusual. Student James Batstone wanted to think about how bees interact with polluted environments and how they could be used as part of reclamation or decontamination programmes. He imagined a city (or territory) abandoned due to pollution, and using bees to collect pollen for analysis to establish whether the territory was ready to be reclaimed for human habitation.

He built a prototype with ‘bee gates’ that would allow for the harmless capture of pollen from individual insects as they return to the hive (Figure 3). He also theorised complementing this with automated software that uses cameras to track and automatically analyse their dance to establish provenance. What we liked about this project is the imaginative idea of using bees to monitor air and land quality, by analysing vegetation through pollen as well as radiation and pollutants in honey, to create maps of land quality levels. Using natural resources and naturally occurring events to complement what technology can do (and vice versa) is the way to achieve sustainable solutions in the long term.

 

Final thoughts

As part of our work at Intel, we collaborate with the world’s top universities to look at the future of cities with an eye toward the intersection of technology, environment, and social sustainability. In our groups one can find entrepreneurs, designers, hacktivists, engineers, data artists, architects and more.

 

We seek to support the same diversity of inspiration in today’s students, the future technology innovators, by tapping into how to connect creativity to technology for more vibrant, connected cities and communities. In many ways, working with first-year master’s students offers a refreshing perspective on how to open these questions with a beginner’s mind-set and how to embrace simplicity in the face of rising information: just because our digital traces and data footprint will keep increasing, our time to juggle what that means won’t.

 

Physical computing is coming into play in new ways, and more often. It will not be enough to get lost in a screen: the interface of tomorrow will be everywhere, and interactions will leap off screens into the real world. ‘Future Health, Future Cities’ suggested how to consider the role of physical computing in helping create more sustainable services by, for example, making transparent what services are needed and where, by exploring how to communicate new urban information streams simply and well, and, last but not least, by reflecting on how to deliver resources where they will be most needed in a constantly changing city.

 

 

*Concepts described are for investigational research only.

**Other names and brands may be claimed as the property of others.

Read more >

Intel at Citrix Synergy 2015: Delivering a Foundation for Mobile Workspaces

From May 12-14, Citrix Synergy 2015 took over the Orange County Convention Center in Orlando, providing a showcase for the Citrix technologies in mobility management, desktop virtualization, server virtualization and cloud services that are leading the transition to the software-defined workplace. Intel and Citrix have worked together closely (https://www.youtube.com/watch?v=gsm26JHYIaY) for nearly 20 years to help businesses improve productivity and collaboration by securely delivering applications, desktops, data and services to any device on any network or cloud. Operating Citrix mobile workspace technologies on Intel® processor-based clients and Intel® Xeon® processor-based servers can help protect data, maintain compliance, and create trusted cloud and software-defined infrastructures that help businesses better manage mobile apps and devices, and enable collaboration from just about anywhere.

 

During Citrix Synergy, a number of Intel experts took part in presentations to highlight the business value of operating Citrix software solutions on Intel® Architectures.

 

Dave Miller, director of Intel’s Software Business Development group, appeared with Chris Matthieu, director of Internet of Things (IoT) engineering at Citrix, to discuss trends in IoT. In an interview on Citrix TV, Dave and Chris talked about how the combination of Intel hardware, Intel-based gateways, and the Citrix* Octoblu IoT software platform makes it easy for businesses to build and deploy IoT solutions that collect the right data and help turn it into insights to improve business outcomes.

 

Dave looked in his crystal ball to discuss what he saw coming next for IoT technologies. He said that IoT’s initial stages have been about delivering products and integrated solutions to create a connected IoT workflow that is secure and easily managed. This will be followed by increasingly sophisticated technologies for handling and manipulating data to bring insights to businesses. A fourth wave will shift the IoT data to help fuel predictive systems, based on the increasing intelligence of compute resources and data analytics.

 

I also interviewed David Cowperthwaite, an engineer in Intel’s Visual and Parallel Computing Group and an architect for virtualization of Intel Processor Graphics. In this video, we discussed how Intel and Citrix work together to deliver rich virtual applications to mobile devices using Citrix* XenApp.  David explained how running XenApp on the new Intel® Xeon® processor E3 v4 family  with Intel® Iris™ Pro Graphics technology provides the perfect platform for mobile delivery of 3D graphics and multimedia applications on the highly integrated, cartridge-based HP* Moonshot System.  

 

Among the more popular demos showcased in the Intel booth were the Intel® NUC and Intel® Compute Stick as zero-client devices. Take a live look in this video. We also released this joint paper on XenServer; take a look.

 

For a more light-hearted view of how Citrix and Intel work together to help you Work Anywhere on Any Device, watch this fun animation.

Read more >

Creating Value—and Luck—in Hospitality. Part I: Charting the Future of the Hotel Experience

I travel a lot for work, and the truth is that it can be a pretty painful experience. I spend a lot of time thinking about how travel and hospitality will be in the future, but perhaps I think about it most when I’m stuck in yet another airport, surrounded by screaming children, wishing I was anywhere but there. What new experiences are going to be available five or 10 years from now? From the moment I leave my house in the morning to the moment I order my guilty snack from room service at midnight, how will that whole process be made better? When you’ve just traveled half-way around the world, you’re dog tired, and waiting in line to check in to your hotel, there are no more important questions.

 

As luck would have it, these are just the sorts of issues Intel’s research team of trained ethnographers, anthropologists, and social scientists are exploring. And what they are finding is giving us a glimpse of an exciting retail, hospitality, and entertainment future distinguished by amazing convenience and control for guests and unprecedented opportunity for the hospitality industry.

 

It’s All About the Customer

 

For hoteliers, the successful ones anyway, it starts with one overriding goal: Deliver the best possible guest experience. To do that means getting to know guests better, learning their likes and preferences, and then delivering high-quality services and experiences that are personalized to their needs.

 

Travelers are becoming savvier and more demanding. Only through that deeper relationship with guests can hotels expect to offer them the truly customized and personalized experiences needed to win and sustain our loyalty.

 

Customization vs. Personalization

 

Let’s start with customization. It’s different from personalization, which I’ll get to in a minute. When we’re talking about customized experiences, we are referring to the ability to deliver on the specific requests of customers: I like the top floor, I have egg white omelets and fruit each morning, I want my room kept at a steady 68 degrees. Based on those identified preferences, brands can tailor the experience to us.

 

By contrast, personalization goes one step further by anticipating what we want, and offering or providing it before we ask. Using data analytics to gather information, interpret it, and optimize the stay, hotels will be able to offer proactive personalization. That will mean providing a host of new experiences, as well as new and greater value.

 

A Seamless, Integrated Experience

 

Once we start opting in to share our information (and allowing the guest to opt in to these kinds of services is critical), hotels will be able to deliver not only what we want, but also when and how we want it. By tapping into the Internet of Things (IoT), our entire journey and stay, from when and how we like to check in to running that videoconference, will be easily and seamlessly integrated into the experience without requiring that we reinvent the wheel each visit.

 

Imagine checking in via your smartphone and avoiding the line, with wayfinding to help you navigate the property. The room is set to your ideal temperature and the TV to your preferred channel. You lay your tablet or laptop down and it immediately begins to charge wirelessly. Using a provided tablet, rather than the stained and cumbersome menu usually found on the desk, you order food in a single click. Your laptop then seamlessly connects with the TV, so you can use the larger screen to chat with family, prepare for your presentation, and then view your movies.

 

And that’s just the beginning. Ease and control will be the watchwords, and the new standards. Using technology and data, hoteliers will be creating what will look like luck or serendipity, but will really be the benefits of a deep understanding of you and your needs.

 

To see an example of what the future will look like, visit the Connected Room prototype Microsoft is unveiling, with Intel’s help, at the HITEC show (booth 2115) June 15-18 in Austin. If you want to read more about what’s coming in retail, take a look at the comprehensive white paper on The Second Era of Digital Retail which I authored last year.

Read more >

Change Your Desktops, Change Your Business. Part 3: Make IT More Effective

In the last two posts in this series, we looked at two issues we’ve all got on our radar: productivity and power savings. They’re both huge targets for today’s businesses because they speak directly to the bottom line. The next topic also translates to real dollars, and that’s IT effectiveness.

 

Now, it goes without saying that we rely on our IT departments to keep us up and running. But for them to be effective, we have to give them the tools they need to get the job done. That starts with making sure people have reliable PCs. Doing so can help IT lower costs and reduce employee downtime, while also giving them the ability to support more systems.

 

So what about those PCs? What are we talking about? The new PCs included in the study we’ve been exploring in the last few desktop blog articles were All-in-Ones and Mini PCs, each with the latest Intel® vPro™ technology as well as the newest version of Intel® Active Management Technology (Intel® AMT).1

 

These new systems let IT access the graphical user interface and control the desktops remotely, no matter the power state of the system. With older fleets, a non-operational desktop that was out of band meant sending someone to physically fix the system. If you’ve ever had to do that, you know the lost time, productivity, and cost involved. Not ideal.

 

For the study, Keyboard-Video-Mouse (KVM) Remote Control was included in the new systems, but not in the older release of Intel AMT (5.2) that was installed on the aging systems.2 The difference in response time is striking.

 

Here’s the scenario: Imagine one of your employees at a remote site calls into the help desk; her desktop is down. In the old way of doing things, a tech would be dispatched, but probably not until the next day. That results in somewhere in the neighborhood of eight hours of downtime, a painful reality for any business.

 

But the new systems explored in the study, the ones with KVM? Employees waited only 15 seconds for IT to initiate the KVM Remote Control session. Those kinds of savings are also felt in the bottom line. The study revealed that the All-in-One and Mini Desktop would reduce the cost of employee downtime by $215.08 for the 10-minute software repair. That’s a saving of nearly 98 percent.3

 

Plus, don’t forget the savings in time spent by IT. The newer desktops, combined with avoiding that travel time, cut the repair cost by $39.65, leading to a savings of some 85 percent. And the savings go up from there the older your legacy systems are. You don’t even want to know what it’s likely to cost once you’re beyond the warranty.
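For anyone who wants to sanity-check those percentages, the arithmetic is simple. The baseline costs in this short Python sketch are back-calculated illustrations from the savings and percentages quoted above, not figures taken directly from the study.

# Quick arithmetic check of the quoted savings. Baselines are back-calculated
# illustrations, not numbers from the study itself.
def percent_saved(old_cost: float, new_cost: float) -> float:
    return 100 * (old_cost - new_cost) / old_cost

# Downtime: a $215.08 saving at ~98% implies a baseline near $220 per incident.
print(round(percent_saved(219.75, 219.75 - 215.08), 1))   # ~97.9
# IT repair: a $39.65 saving at ~85% implies a baseline near $47 per repair.
print(round(percent_saved(46.65, 46.65 - 39.65), 1))      # 85.0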

 

In the final installment in the series on PC refresh, let’s dive into how to actually leverage the newest technology. And don’t forget, you can review the complete study cited above.

Join the conversation using #IntelDesktop.

 

This is the third installment of the “Change Your Desktops, Change Your Business” series in the Desktop World Tech Innovation Series. To view the other posts in the series, click here: Desktop World Series.

 

1. For more information on Intel AMT, visit http://www.intel.com/content/www/us/en/architecture-and-technology/intel-active-management-technology.html?wapkw=amt

2. https://software.intel.com/sites/manageability/AMT_Implementation_and_Reference_Guide/default.htm?turl=WordDocuments%2Fkvmandintelamt.htm

Read more >