Recent Blog Posts

Using Wearable Technology to Advance Parkinson’s Research



Guest Contributor, Moran Peri – Predictive Analytics Project Manager and Team Lead at Intel. 


Moran specializes in data mining and machine learning solutions, from inception through post-implementation monitoring, in the areas of chip manufacturing and testing.

 

At least one million Americans and five million people worldwide are coping with Parkinson’s disease (PD). As of today, no objective diagnostic test exists, nor is there a cure. Much about PD remains a mystery.

 

Hundreds of skilled neurologists, mathematicians, and data analysts across the globe are looking for real-life, at-home PD data sets on which to exercise their expertise and pursue innovative research directions.


Intel is working with The Michael J. Fox Foundation for Parkinson’s Research (MJFF) to enable breakthroughs in Parkinson’s disease research through wearable and big data analytics technologies. A large, virtual data-gathering study is now underway: gather sensory data from PD patients; analyze that data to identify patterns and make generalizations; and use the insights gained to accelerate the development of therapeutic breakthroughs, and potentially even a cure, for the disease. Roughly 130 patients in the United States are already streaming sensor data, and more than 65,000 hours of data have been gathered so far.


Utilizing Wearable Technology for PD research


The Fox Insight smartphone application, currently available on the Android platform, is connected to a Pebble smartwatch application (Fox Insight Wear) that users wear on their wrist. The app is designed to bring value to patients in their day-to-day lives. It lets patients report their medication usage and how they feel, providing them with an electronic diary that logs their personal, subjective overall state. The Fox Insight mobile app also lets patients add medication reminders. Armed with personalized information and graphs, as well as the medication history provided by the app, patients are able to track and monitor their activity levels, tremor, and nighttime activity, allowing them to fine-tune their regimen to suit their personal preferences and needs. Patients’ time-stamped records of behavior will help researchers correlate patients’ activity, feelings, and medications into meaningful hypotheses that can later be tested through normal scientific methods.

 

IoT at the service of PD research


Data is our most valuable asset: the more data, the better. But it also presents a greater challenge. The wearable inertial measurement units (IMUs) are capable of recording 150 to 300 samples of sensory data per second per user. With hundreds or thousands of concurrent users streaming data for months and years, there’s an obvious requirement for a system that can collect, store, and process massive amounts of data. To fulfill this requirement, we use an IoT framework. The big data tools used in the platform are all scalable and include, among others, a messaging framework (the Mosquitto MQTT broker and the Akka toolkit, which allows for distributed parallel processing), big data storage based on the Cloudera distribution for Hadoop, and an application interface layer based on the Play framework.
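To put those sample rates in perspective, here is a back-of-the-envelope calculation of the raw data volume. The per-sample payload size is an assumption for illustration (the post doesn't state it); the 300 samples/second figure is the upper bound quoted above.

```python
# Rough throughput estimate for the wearable data pipeline.
# ASSUMPTION: ~32 bytes per IMU sample (timestamp + 3-axis accel/gyro
# as floats). The actual wire format is not described in the post.

SAMPLE_BYTES = 32        # assumed payload size per sample
SAMPLES_PER_SEC = 300    # upper bound quoted for the wearable IMU

def daily_bytes_per_user(sample_bytes=SAMPLE_BYTES, rate=SAMPLES_PER_SEC):
    """Raw bytes one user streams in 24 hours at the peak sample rate."""
    return sample_bytes * rate * 60 * 60 * 24

def fleet_gb_per_day(users):
    """Uncompressed gigabytes per day for a fleet of concurrent users."""
    return users * daily_bytes_per_user() / 1e9

# One user: ~0.83 GB/day raw; 1,000 concurrent users: ~829 GB/day.
print(f"per-user/day: {daily_bytes_per_user() / 1e9:.2f} GB")
print(f"1,000 users/day: {fleet_gb_per_day(1000):.0f} GB")
```

Even under these conservative assumptions, a thousand concurrent users generate close to a terabyte of raw samples per day, which is why scalable ingestion (MQTT/Akka) and storage (Hadoop) are baked into the platform rather than bolted on.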

 


 

Intel’s contribution to science & research


At Intel, we believe that piloting a big data approach to treatment discovery will highlight the possibilities to the healthcare industry. Providing a secure archive of patients’ data, as well as contributing to algorithm design, may help accelerate first-stage discoveries.

Although the immediate goal is to improve the quality of care for PD sufferers and lead clinical research scientists to a potential cure, we believe that the tools, methods, and algorithms should be applicable to clinical trials for other afflictions and to other scientific discoveries in general.

 

To get involved or to learn more about this study, visit https://www.michaeljfox.org/fox-insight-form.html.

Read more >

Intel Champions Russian Tech Innovations at IoT Ignition Lab in Moscow

Rooted in a long history of successful collaboration in scientific and technological innovations between Russia and the West, Intel and its ecosystem partners celebrated the opening of a new Intel IoT Ignition Lab in Moscow this summer. The Internet of … Read more >

The post Intel Champions Russian Tech Innovations at IoT Ignition Lab in Moscow appeared first on IoT@Intel.

Read more >

Cloud-Aware Apps Hold the Keys to the Ultimate Data Center

Are you ready to continue the journey to software-defined infrastructure? In an earlier post, I explored Two Key Stops along the Road to SDI: Automation and Orchestration. These stops are essential milestones in the trip to the ultimate destination: an SLA Managed data center.



 

In the SDI maturity model, the Automation and Orchestration stages feed into the SLA Managed stage, but the truth is they alone won’t get you there. To get to your final destination, your applications must be written to take full advantage of a cloud environment. Cloud-aware apps are like the vehicles on the road to the SLA Managed data center.

 

In more specific terms, cloud-aware apps know what they need to do to fully leverage the automation and orchestration capabilities of the SDI platform. They are written to expand and contract automatically to maintain optimal levels of performance, availability, and efficiency. They understand that in a cloud environment, there are multiple routes to a database and multiple systems available to process data. In essence, they do not worry about redundancy, because the automation and orchestration layers manage it in the environment.
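The "multiple routes to a database" idea can be sketched in a few lines. This is illustrative only: the endpoint names, the retry counts, and the backoff policy are invented for the example, not drawn from any particular SDI platform.

```python
import random
import time

# Hypothetical endpoint list; in an SDI environment the orchestration
# layer would populate and update this dynamically.
DB_ENDPOINTS = ["db-a.internal", "db-b.internal", "db-c.internal"]

class AllEndpointsFailed(Exception):
    """Raised only when every known route is down on every attempt."""

def query_with_failover(run_query, endpoints=DB_ENDPOINTS, retries=2):
    """Try each endpoint in random order; one dead route is not fatal."""
    for attempt in range(retries + 1):
        for endpoint in random.sample(endpoints, len(endpoints)):
            try:
                return run_query(endpoint)
            except ConnectionError:
                continue  # this route is down; try the next one
        time.sleep(0.1 * (attempt + 1))  # brief backoff before the next pass
    raise AllEndpointsFailed("no database route is currently reachable")
```

A conventional, tightly coupled app would raise on the first `ConnectionError` and page an admin; the cloud-aware version treats any single route as expendable and only fails when the whole pool is unreachable.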

 

 

This is quite unlike the conventional approach to apps. Most of today’s apps are tightly coupled with a particular database and a certain set of infrastructure resources. They require items such as session persistence and connection management. If any of the links break—for example, the app loses its connection to the database—the app goes down and IT admins go into fire-drill mode as they scramble to bring the app back online.  Over the past 20 years, we have done our best to automate the fire drill.

 

In a metaphorical view, we’re talking about the difference between baseball and football. In baseball, things pretty much proceed in a linear and predictable manner. There are few moving parts—there’s one pitcher throwing to one batter—and aside from the occasional base-stealer you pretty much know where all the players are at all times. This is the way things work with the conventional app.

 

In a cloud environment, things are more football-like. The players are all over the place and the same play can unfold in very different ways. When a receiver runs the wrong route, the play doesn’t come to a stop. The quarterback simply looks for other receivers who are in position to make a play. The cloud-aware app functions like a quarterback who improvises to keep the ball moving down the field.

 

Here’s where things get harder. It’s not a trivial undertaking to make apps cloud-aware. In the case of legacy apps, the code has to be rewritten pretty much from top to bottom to build in cloud-awareness, or the legacy part needs to be wrapped in services so that cloud awareness can be layered around the legacy portion. So we’re talking about a lot of heavy lifting for your software developers.

 

The good news is you don’t have to do all of this heavy lifting at once. We’re still quite some time away from the day of the SLA Managed data center. We have to first build the integrated orchestration platforms and automation toolsets that enable a software-defined approach to the data center. The key is to understand that this day is coming, and begin taking steps to make your apps cloud-aware.

 

Any new apps should be written to be cloud-aware. As for your legacy apps, you won’t be able to rewrite them all at once, so you’re going to need to identify the apps that are most likely to benefit from cloud awareness as you move to software-defined infrastructure, or simply wrap them in services.

 

The wrapped applications can help move many critical apps to a more cloud-like environment without rewriting a lot of code. But those apps won’t be able to benefit from all of the goodness of an SLA Managed data center. In an SLA Managed world, software-defined infrastructure and the apps work in tandem to deliver optimal performance with minimal downtime.

 

These gains are made possible by the ability of the orchestration platform to move workloads and supporting resources around on the fly to meet the policies you set for your applications. When demand spikes, the SDI environment grabs the resources the app needs to keep performance in line with the required service levels, even if that means bursting to a public cloud to gain additional processing power.

 

If this sounds like IT nirvana, you’ve got it. In the SLA Managed data center, application downtime will be rare, and unpredictable application performance will seem more like a problem from the past than a constant threat in the present. You’ll be able to breathe easier when unusually large crowds of holiday shoppers converge on a particular app, because you’ll know that the backend systems will take care of themselves.

 

So that’s the 30,000-foot view of the last stretches of the road to SDI. If you consider where we are today and where we need to travel, you can see that we are talking about a long road, and one that can have many unique twists and turns. The key is to think about how you’re going to get to SDI, identify the vehicles that will move you forward, and then begin your journey.

 

Find me @EdLGoldman, and share your thoughts and comments.

Read more >

Amplify Your Value: Take a Journey to the Cloud!



“Hello, my name is Jeff Ton and it has been one thousand, two hundred and seventy-two days since I last opened Outlook.”


February 6, 2012: an historic date in Indianapolis, Indiana. Yeah, there was some little football game that night, Super Bowl XLVI, the New York Giants against the New England Patriots. But that is not the event that made the date historic (though it was great to watch a Manning beat Brady!). What made that date historic was our go-live on Google Apps, our first step in our Journey to the Cloud.


Now that I have offended everyone from the Pacific Northwest and New England, let me rewind and start at the beginning. In 2010, I arrived at Goodwill Industries of Central Indiana. We were running Microsoft Exchange 2003 coupled with Outlook 2010. Back in the day, the adage was “No one ever got fired for buying IBM”; I was in the “No one ever got fired for buying Microsoft” camp. In fact, when I learned the students in our high school were using Google, I was pretty adamant that they use Office. After all, that is what they would be using when they got jobs!


At about this same time, we were switching from Blackberry to Android-based smartphones. We were having horrible sync problems between Exchange and the Androids using ActiveSync. We needed to upgrade our Exchange environment desperately!


As we were beginning to make plans for upgraded servers to support the upgraded Exchange environment, I attended my first MIT Sloan CIO Symposium in Boston. Despite the fact that I bleed Colts blue, I actually love Boston: the history, the culture, the vibe; but I digress. At the conference I learned about AC vs. CF projects (see That Project is a Real Cluster to learn more). I could not fathom a more likely CF project than an email upgrade project. Why not look to the cloud? Since we were doing an upgrade anyway, perhaps this would be the LAST email upgrade we would have to do!


Enter The Google Whisper. For months a former colleague-turned-Google-consultant had been telling me we should check out Google as an email platform. Usually my response was “Google? That’s for kids, not an enterprise!” (Ok, now I have offended everyone from Silicon Valley, too!) Every time I saw him, he would bring it up. I finally agreed to attend one of Google’s roadshow presentations. I came away from that event with an entirely different outlook (pun intended) on Google.


We decided to run an A/B pilot. We would convert 30 employees to the Google Apps platform for 60 days. We would then convert the same 30 employees to BPOS (predecessor to Office 365) for 60 days and may the best man, er, I mean platform, win. We handpicked the employees for the pilot. I purposely selected many who were staunchly in the Microsoft camp and several others who typically resisted change.


At the end of the pilot an amazing thing happened. Not one person on the pilot team wanted to switch off of Google onto BPOS, in fact, each and every person voted to recommend a Google migration to the Executive Team. Unanimous! When was the last time that ever happened in one of your projects?!!?


The decision made, we launched the project to migrate to the cloud! We leveraged this project to also implement our email retention policy (email is retained for five years). The vast majority of the work in the project involved locating all the .PST files in our environment and moving them to a central location from network file folders, local drives, and yes, even thumb drives and CDs. Once in that central location, they were uploaded to the Google platform. During this time, we also mirrored our email environment so every internal and external email also went to the Google platform in real time.
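The discovery step — sweeping every drive and share for stray archives — is the kind of task that is worth scripting. A minimal sketch follows; the scan roots are hypothetical (the team's actual tooling isn't described), and any real sweep would also need credentials for network shares and removable media.

```python
from pathlib import Path

def find_pst_files(roots):
    """Walk each root directory and yield every .pst file found (any case)."""
    for root in roots:
        base = Path(root)
        if not base.is_dir():
            continue  # skip shares or drives that are not mounted
        for path in base.rglob("*"):
            if path.is_file() and path.suffix.lower() == ".pst":
                yield path

# Example with hypothetical roots: list every archive with its size in MB.
for pst in find_pst_files([r"\\fileserver\shares", r"C:\Users"]):
    print(f"{pst}  {pst.stat().st_size / 1e6:.1f} MB")
```

Running something like this early gives you the inventory (and the total gigabytes) you need to size the upload to the new platform before go-live.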


The process took about three months, but finally it was Super Bowl Sunday: time for go-live. Now before you think me an ogre of a boss for scheduling a major go-live for Super Bowl Sunday, I should tell you, the date of February 6, 2012 was selected by the project team. Their thought? No one is going to be doing email after the game is over. We announced a blackout period of eight hours beginning at midnight to do our conversion. Boy, were we ever wrong about the length of the blackout period! Our conversion that night took about 20 minutes. Twenty minutes, and email was flowing again in and out of the Google environment.


Our implementation included email, contacts, calendar, and groups for three domains. We made the decision to keep the other Google Apps available, but not promote them. We also implemented our five-year archive and optional email encryption for sensitive communications. The other decision we made (ok, I made) was not to allow the use of Outlook to access Gmail. One of the tenets of our strategic plan was “Any time, Any place, Any device”; I felt having a piece of client software violated that tenet and created additional support issues that were not necessary.


We learned several things as a result of the project. First, search is not sort. If you have used Gmail, then you know there is no way to sort your inbox; it relies instead on the power of Google search. People really like their sort, and it took some real handholding to get them comfortable.


Second, Google Groups are not Distribution Lists. We converted all of our Exchange Distribution Lists to Groups. Yes, they do function in somewhat the same way, however, there are many more settings in Groups, settings that can have unexpected consequences. Consequences like the time our CFO replied to an email that had been sent to a Group, and even though he did not use reply all, his reply went to everyone in the Group! We found that setting very quickly and turned it off! (Sorry Dan!)


The third lesson learned was “You cannot train enough.” Yes, we held many classes during the lead-up to conversion and continued them long afterwards. A lot of the feedback we had heard (“everyone has Gmail at home, we already know how to use it”) led us to believe that once the initial project was complete we didn’t need to continue training. We recently started a series of Google Workshops to continue the learning process. Honestly, I think some of this is generational. Some love to click on links, watch a video, and then use the new functionality. Others really want a classroom environment. We now offer both.

One of the things that pleasantly surprised us (well, at least me) was the organic adoption of other Google tools. The first shared Google Doc came to me from outside the IT department. The first meeting conducted using Google Hangouts came from the Marketing department. People were finding the apps and falling in love with them.


Today, one thousand, two hundred and seventy-two days later our first step to the cloud is seen as a great accomplishment. It has saved us tens of thousands (if not hundreds of thousands) of dollars, thousands of hours, and has freed up our team to work on those AC projects!


Before I close, I do want to say, we are still a Microsoft shop. We have Office, Windows, Server, SQL Server and many other Microsoft products. This post is not intended to be a promotion of one product over another. As I said in my previous post, your path may be different from ours. For us, a 3,000-employee non-profit, Google was the right choice. You may find it meets your requirements, or you may find another product is a better fit. The point here is not the specific product, but the product’s delivery method…cloud…SaaS. The project was such a resounding success, we changed one of our Application Guiding Principles. We are now “cloud-first” when selecting a new application or upgrading an existing one. In fact, almost all of the applications we have added in the last three and a half years have been SaaS-based, including Workday, Domo, Vonigo, ETO, Facility Dude and more.


Go and Get Your Google On!

Go and get your Google on, later hit your Twitter up

We out workin’ y’all from summer through the winter, bruh

Red eye precision with the speed of a stock car

You’re now tuned in to some Independent Rock Stars


Next month, we will explore a project that did more to take us to a Value-add revenue generating partner than just about any other project. Amplify Your Value: Reap the Rewards!


The series, “Amplify Your Value” explores our five year plan to move from an ad hoc reactionary IT department to a Value-add revenue generating partner. #AmplifyYourValue


We could not have made this journey without the support of several partners, including, but not limited to: Bluelock, Level 3 (TWTelecom), Lifeline Data Centers, Netfor, and CDW. (mentions of partner companies should be considered my personal endorsement based on our experience and on our projects and should NOT be considered an endorsement by my company or its affiliates).


Jeffrey Ton is the SVP of Corporate Connectivity and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.


Find him on LinkedIn.

Follow him on Twitter (@jtongici)

Add him to your circles on Google+

Check out more of his posts on Intel’s IT Peer Network

Read more from Jeff on Rivers of Thought

Read more >

An Omni-Channel Think Tank at FIT

I had the privilege of representing Intel at the Fashion Institute of Technology’s (FIT) Symposium on Omni Retailing in New York in April.

 

And the privilege of listening to several industry leaders and – of great interest – a team of FIT’s top senior students, who presented their vision for the store of tomorrow.

 

Some common threads:

  • We’re living in a world of digital screens – brands can either get on board or get left behind.
  • Brand success is as much about effective storytelling as it is about product and operational efficiency. And the best brands tell their stories across the screens.
  • When it comes to the millennial shopper, it’s about authenticity and trust.

 

And, of course, technology is the thread that runs through it all.

 

Highlights

 

Jennifer Schmidt, Principal and leader of the Americas Apparel Fashion and Luxury practice at McKinsey & Company, emphasized the importance of storytelling in this important global segment. According to Ms. Schmidt, 50 percent of value creation in fashion and luxury is about perception – the ability of a brand to consistently deliver (in every facet of the business) a differentiating, conversation-building, relationship-building story.

 

(Those who joined Dr. Paula Payton’s NRF store tour in January will remember her emphasis on storytelling and narrative).

 

Ms. Schmidt also spoke to three elements of import in her current strategy work:
    • The change in the role of the store – which now shifts from solely emphasizing transactions to brand-building – and with 20-30% fewer doors than before;
    • The change in retail formats – which, in developed world retailing, now take five different shapes: 1) flagship store, 2) free-standing format, 3) mini- and urban-free standing, 4) shops within shops and 5) outlet;
    • The importance of international expansion, especially to the PRC and South Asia.

 

Daniella Yacobovsky, co-founder of online jewelry retailer Baublebar, also noted the importance of brand building – and she explained that her brand story is equal parts product and speed. Baublebar works on an eight-week production cycle, achieving previously unheard-of turns in jewelry. Data is Ms. Yacobovsky’s friend – she tracks search engine results, web traffic and social media to drive merchandising decisions.

 

And, last but certainly not least: FIT seniors Rebeccah Amos, Julianne Lemon, Rachel Martin and Alison McDermott, winners of FIT’s Experience Design for Millennials Competition, opined on what makes the best brand experience for millennials. Their unequivocal answer – paired with a lot of good, solid retailing advice – was videos and music.

 

It’s not just about entertainment. It’s also an issue of trust and authenticity (does a brand’s playlist resonate with you?), which ultimately leads to brand stickiness.

 

Envision video – and lots of it. On enormous, in-store video walls, on mobile, hand-held devices and on brand YouTube channels. To display products virtually or provide information on how to wear or accessorize them. With in-store video, retailers can orchestrate, curate and simplify, giving shoppers a fast, trusted way to be on trend.

 

Music? The students suggested that every brand needs a music director. Brand-right soundtracks and playlists and connections to the right bands and music events can be powerful influences on today’s largest consumer group.

 

Quite the day.

 

Jon Stine
Global Director, Retail Sales

Intel Corporation

 

Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

 

* Other names and brands may be claimed as the property of others.

 

© 2015 Intel Corporation

Read more >

10 Mobile BI Strategy Questions: Design

When the term design is used in mobile business intelligence (BI), it often refers to the user interface (UI). However, when I consider the question of design in developing a mobile BI strategy, I go beyond what a report or dashboard looks like.

 

As I wrote in “Mobile BI” Doesn’t Mean “Mobile-Enabled Reports,” when designing a mobile BI solution, we need to consider all facets of user interactions and take a holistic approach in dealing with all aspects of the user experience. Here are three areas of design to consider when developing a mobile BI strategy.

 

How Should the Mobile BI Assets Be Delivered?

 

In BI, we typically consider three options for the delivery of assets: push, pull, and hybrid. The basic concept of a “push” strategy is similar to ordering a pizza for home delivery. The “users” passively receive the pizza when it’s delivered, and there’s nothing more they need to actively do in order to enjoy it (ok, maybe they have to pay for it and tip the driver). Similarly, when users access a report with the push strategy, whether through regular e-mail or a mobile BI app, it’s no different than viewing an e-mail message from a colleague.

 

On the other hand, to have pizza with the pull strategy, users need to get into their cars and drive to the pizza place. They must take action and “retrieve the asset.” Likewise, users need to take action to “pull” the latest report and/or data, whether they log on using the app or a mobile browser. The hybrid approach employs a combination of both the push and pull methods.

 

Selecting the right delivery system for the right role is critical. For example, the push method may be more valuable for executives and sales teams, who travel frequently and may be short on time. However, data updates are less frequent with the push method, so accessing the latest data can’t be critical if you choose this option. In contrast, the “pull” strategy may be more appropriate for analysts and customer service teams, who depend on the latest data.
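The staleness trade-off behind that role-matching advice can be shown in a few lines. This is a toy in-memory model — the report name, its contents, and the "report store" dict are invented for the example, not any BI product's API.

```python
import copy
import datetime as dt

# Toy in-memory stand-in for the BI server's report store.
report_store = {"daily_sales": {"data": [42, 17], "as_of": dt.date(2015, 8, 1)}}

def push_snapshot(report_name):
    """Push: the server freezes a copy at send time and delivers it.
    The recipient keeps seeing data as of the last scheduled run."""
    return copy.deepcopy(report_store[report_name])

def pull_latest(report_name):
    """Pull: the user connects and retrieves whatever is current right now."""
    return report_store[report_name]

inbox = push_snapshot("daily_sales")            # delivered overnight
report_store["daily_sales"]["data"].append(99)  # new data lands later
assert 99 not in inbox["data"]                  # the pushed copy is stale
assert 99 in pull_latest("daily_sales")["data"] # a pull sees the update
```

The pushed snapshot never changes after delivery — convenient for a traveling executive, unacceptable for an analyst who needs the 99 that landed an hour ago. That asymmetry is exactly why delivery mode should be chosen per role.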

 

Additional considerations include data security and enterprise mobility. Does the current BI solution or software support both options? Can the integrity of data security be maintained if data assets are delivered outside the demarcation lines (for example, mobile BI report delivered as an attachment to an e-mail)?

 

What Are the Format and Functionality of the Mobile BI Assets?

 

The format deals with the type and category of the asset that is delivered to mobile BI users. What does the end-user receive? Is it a static file in Adobe PDF or Microsoft Excel format with self-contained data, or is it dynamic such as a mobile BI app that employs native device functionality? Is the format limited to data consumption, or does it allow for interactions such as “what-if” scenarios or database write-back capability?

 

If the format supports exploration, what can I do with it? Can I select different data elements at run time as well as different visualization formats? How do I select different values to filter the result sets, like prompts? Does the format support offline viewing? Is the format conducive to collaboration?

 

Does the User Interface Optimize the BI Elements?

 

The UI represents the typical BI elements that are displayed on the screen: page layout, menus, action buttons, orientation, and so on. When you consider the design, decide if the elements really add value or if they’re just pointless visualizations like empty calories in a diet. You want to include just the “meat” of your assets in the UI. More often than not, a simple table with the right highlighting or alerts can do a better job than a colorful pie chart or bar graph.

 

In addition, the UI covers the navigation among different pages and/or components of a BI asset or package. How do the users navigate from one section to another on a dashboard?

 

Bottom Line: Design Is Key for the User Experience

 

The end-to-end mobile BI user experience is a critical component that requires a carefully thought-out design that includes not only soft elements (such as an inviting and engaging UI), but also hard elements (such as the optimal format for the right role and for the right device). Designing the right solution is both art and science.

 

The technical solution needs to be built and delivered based on specifications and following best practices – that’s the science part. How we go about it? That’s entirely art. It requires both ingenuity and critical thinking, since not all components of design come with hard-and-fast rules that we can rely on.

 

What other facets of the mobile BI user experience do you include in your design considerations?

 

Stay tuned for my next blog in the Mobile BI Strategy series

 

Connect with me on Twitter at @KaanTurnali and LinkedIn.

 

This story originally appeared on the SAP Analytics Blog.

Read more >

Meet 3rd Gen Intel® Wireless-AC 8260 (2×2 802.11ac Wi-Fi) for Windows 10

The Intel® Dual Band Wireless-AC 8260 is Intel’s 3rd Generation 802.11ac, dual band, 2×2 Wi-Fi + Bluetooth® 4.2 adapter. It’s engineered to deliver lower power consumption, improved RF coexistence, and complete Microsoft Windows 10 support. The M.2 1216 form factor … Read more >

The post Meet 3rd Gen Intel® Wireless-AC 8260 (2×2 802.11ac Wi-Fi) for Windows 10 appeared first on Technology@Intel.

Read more >

Young Hackers, Hipsters, and Hustlers Create Future Product Concepts in Our Summer Innovation Program

Teenagers! Not only did I used to be one, but I now have two of them living in my house—and I’m continually amazed by the unique perspective they bring to conversations and their fearlessness in trying new things. So why … Read more >

The post Young Hackers, Hipsters, and Hustlers Create Future Product Concepts in Our Summer Innovation Program appeared first on Intel Software and Services.

Read more >

Under the Hood: How Dynamic Resource Pooling Unlocks Innovation


If you have watched a movie on Netflix*, called for a ride from Uber* or paid somebody using Square*, you have participated in the digital services economy. Behind those services are data centers and networks that must be scalable, reliable and responsive.

 

Dynamic resource pooling is one of the benefits of a software defined infrastructure (SDI) and helps unlock scalability in data centers to enable innovative services.

 

How does it work? In a recent installment of Intel’s Under the Hood video series, Sandra Rivera, Intel Vice President, Data Center Group and General Manager, Network Platforms Group, provides a great explanation of dynamic resource pooling and what it takes to make it happen.

 

In the video, Sandra explains how legacy networks, built using fixed-function, purpose-built network elements, limit scalability and new service deployment. But when virtualization and software defined networking are combined into a software defined infrastructure, the network can be much more flexibly configured.

 

Pools of virtualized networking, compute and storage functionality can be provisioned in different configurations, all without changing the infrastructure, to support the needs of different applications. This is the essence of dynamic resource pooling.

 

To get to an infrastructure that supports dynamic resource pooling takes the right platform. Sandra talks about how Intel is helping developers build these platforms with a strategy that starts with powerful silicon building blocks and software ingredient technology, in addition to support for open standards development, building an ecosystem, collaborating on technology trials and delivering open reference platforms.

 

It is an exciting time for the digital services economy – who knows what service will become the next Netflix, Uber or Square!

 

There’s much more to Sandra’s overview of dynamic resource pooling, so I encourage you to watch it in its entirety.

 

Read more >

Empowering Wiltshire Police Employees with Mobile Technology

What enables you to do really great work? Motivation to do a good job and belief in what you are doing are important. You also need access to the right tools and resources — be they pen and paper, a complex software package, or your team and their expertise. And you need the freedom to decide how you are going to pull all this together to achieve your goals.

 

I’ve recently seen how Wiltshire Police Force has used technology to bring together the combination of drive, the right tools and the freedom to act. Working with Wiltshire Council, it has developed a new approach to policing that empowers staff members to decide how, when and where they work in order to best serve the local community.

 

The organization deployed 600 tablets and laptop PCs, all powered by Intel® Core™ i5 processors, placing one in each patrol vehicle and giving some to back-office support staff. The devices connect (using 3G) to all the applications and systems the officers need. This allows them to check case reports, look up number plates, take witness statements, record crime scene details, and even fill in HR appraisal forms, from any location.


It’s What You Do, Not Where You Do It


Kier Pritchard is the assistant chief constable who drove the project. He and his team follow the philosophy that “work should be what you do, not where you go”. By giving officers the flexibility to work anywhere, he’s empowering them to focus on doing their jobs, while staying out in the community.

 

“We’re seeing officers set up in a local coffee shop, or the town hall,” he said. “In this way they can keep up to date with their cases, but they’re also more in touch with the citizens they serve.”

 

The other advantage of the new model is that officers can be much more productive. There’s no more driving to and from the station to do administrative tasks. Instead, they can catch up on these in quiet periods during their shift. “This essentially means there’s no downtime at all for our officers now,” said Pritchard.

 

The introduction of this new policing approach has gone down well with Wiltshire’s officers. They’ve taken to the devices enthusiastically and are regularly coming up with their own ways of using them to improve efficiency and collaboration.

 

In addition to making the working day more productive and rewarding for its staff, the mobile devices have also made a big difference to Wiltshire residents. Specialists in different departments of the police force are able to collaborate much more effectively by sharing their findings and resources through an integrated platform, making the experience for citizens much smoother. Areas in which the devices are used have also seen an improvement in crime figures thanks to the increased police presence within the community — for example, in the town of Trowbridge, antisocial behaviour dropped by 15.8 percent, domestic burglaries by 34.1 percent, and vehicle crime by 33 percent.

 

You can read more about how the officers are using the devices to create their own ideal ways of working in this recently published case study or hear about it in the team’s own words in this video. In the meantime, I’d love to hear your views on the role of mobile technology in empowering the workforce — how does it work for you?

 

To continue this conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.


Find me on LinkedIn.

Keep up with me on Twitter.

Read more >

#TechConnect July 22 Chat Recap: “Intel® 2-in-1 and Tablet Opportunities in Education”

Thanks to all who joined the Tech Connect Chat on Wednesday, July 22 at 1 p.m. EDT/ 10 a.m. PDT. Intel’s Blake Sweeten and Kelly Boyle led the discussion on how to leverage new cloud-based, Intel®-powered Chromebooks and technology to create a simple, streamlined … Read more >

The post #TechConnect July 22 Chat Recap: “Intel® 2-in-1 and Tablet Opportunities in Education” appeared first on Technology Provider.

Read more >