Recent Blog Posts

Change Your Desktops, Change Your Business. Part 4: Leverage the Newest Technology



Do you have a smartphone? These days, chances are pretty good that you do. So, that means using touch has probably become pretty normal for you: It’s natural, easy, and fast. Well, it’s probably no surprise that businesses are increasingly seeing the upside of bringing those same benefits to their desktop PCs.


It’s really all about being able to work in the way that makes the most sense for you. With touchscreen displays, people can closely interact with web pages, images, videos, PDFs—all kinds of content. But then they can switch to the keyboard and mouse for typing and other tasks best suited to that interface. I think that makes a lot of sense.


The study we’ve been addressing during this series on desktops found that a touch-enabled display added about $186 to the starting price of an All-in-One PC.1 Is it worth it? Here’s a good way to look at it: If that touchscreen can lead to even one minute of additional productivity, which doesn’t seem like a stretch, it could pay for itself in under 20 months.2,3 For even more examples of how the power of touch has revolutionized desktop computing, check out the infographic here.
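The payback arithmetic behind footnotes 2 and 3 is simple enough to sketch. A minimal back-of-the-envelope calculation, using the study’s figures (the $186 touch premium and a $350 productivity valuation spread over 36 months), looks like this:

```python
# Back-of-the-envelope payback calculation for the touchscreen premium.
# Figures come from the study's footnotes: a $186 added cost, and one
# extra minute of daily productivity valued at $350 over 36 months.
touch_premium = 186.00        # added cost of the touch display, USD
value_over_period = 350.00    # value of 1 extra minute/day of productivity
months = 36                   # amortization period, months

monthly_value = value_over_period / months       # about $9.72 per month
payback_months = touch_premium / monthly_value   # about 19.1 months

print(f"Monthly value of one extra minute/day: ${monthly_value:.2f}")
print(f"Payback period: {payback_months:.1f} months")
```

Any additional productivity beyond one minute a day shortens the payback proportionally.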


But touch is just one of the many innovations available to businesses today. Many companies, for example, are replacing their work PCs so that they can take advantage of the latest USB technology. The difference comes down to speed. The USB 3.0 ports in the latest All-in-One PCs and Mini Desktops offer transfer rates up to 10X faster than the USB 2.0 ports in your aging legacy desktop towers.


Then there’s DisplayPort, which is available on All-in-One PCs and Mini Desktops. Tests confirmed that it can support higher-performance, lower-power monitor displays than legacy systems. Plus, DisplayPort or HDMI (also available on the new desktop PCs) enables you to add a second display.


New wireless technology is also available with the new All-in-One PCs and Mini Desktops in the form of Dual Band Wireless-AC 7260 cards. But they’re not available in those older desktop towers. That means more flexibility for your employees because you don’t have to worry about including an Ethernet port and cable for each desktop.


And lastly, all of that technological brilliance now comes in a significantly smaller package. To be more specific, the study found that today’s All-in-One PCs and Mini Desktops save you 59 and 60 percent, respectively, in workspace inches compared to legacy desktops.


The moral of the story, and the point of this desktop blog series, is that moving from your aging desktop fleet to newer All-in-One PCs and Mini Desktops can make a real difference for your business, from improved performance and lower energy costs to greater IT effectiveness and access to the newest technology. So you have to ask: How could changing my desktops change my business? Join the conversation using #IntelDesktop.


This is the fourth and final installment of the “Change Your Desktops, Change Your Business” series in the Desktop World Tech Innovation Series. To view the other posts in the series, click here: Desktop World Series.


1. The starting price ($1,598) plus three-year ProSupport Service brought the price to $1,677.57 for the All-in-One PC with a non-touch-enabled display, an Intel Core i5 processor, and 8 GB of memory. The starting price (including three-year ProSupport Service) for the All-in-One PC with the same processor and memory and a touch-enabled display was $1,863.29, priced on 12/19/2014.


2. One minute a day of added productivity, valued at $9.72 per month ($350/36), could provide payback for the $186 cost in 19.1 months.


3. Note: We tested the All-in-One PC installed with Windows 7 so that system configuration matched closely to the legacy desktop tower. The All-in-One PC and Mini Desktop are available with either Windows 7 or Windows 8.1.

Read more >

How Free Wi-Fi Can Transform the Patient Experience in the NHS

I’m often reminded that within the health IT sector we overlook some of the simpler opportunities to provide a better healthcare experience for both clinical staff and patients. A great example of this was the news that the NHS is investigating the feasibility of providing free Wi-Fi across its estate, which it estimates will ‘help reduce the administrative burden currently estimated to take up to 70 percent of a junior doctor’s day’. I’ll cover the often-talked-about benefits to clinicians in a later blog, but here I want to focus on how access to free Wi-Fi could impact the patient in a myriad of positive ways.


Today many of us see access to the internet via Wi-Fi just like any other utility. It’s not something we think of too deeply, but we expect it to be there, all day, every day. Yet access to Wi-Fi in an NHS hospital either comes at a price or is not available at all. The vision put forward by Tim Kelsey, NHS England’s National Director for Patients and Information, could truly revolutionise the continuum-of-care experience and fundamentally change the relationship between patient/family and hospital. I’ve highlighted five of the main benefits below:


1. Enhances Education

Clinicians will say that a better-informed patient is more likely to buy in to their treatment plan. Traditionally, an inpatient is given updates on their condition verbally by a doctor ‘doing the rounds’ once or twice per day at the bedside. With the availability of free Wi-Fi in hospitals and the much-anticipated electronic patient access to all NHS-funded services by 2020, I anticipate a patient being able to simply log in to see real-time updates about their condition at any time of the day via their electronic health record. And Wi-Fi may offer opportunities to provide access to online educational material approved by the NHS too. I would add a cautionary note here, though, around the differing levels of interpretation of medical data by clinicians and patients.


2. Connecting Families

A prolonged stay in hospital affects not just the patient but the wider family too. Free Wi-Fi changes what can sometimes be a lonely and isolated period for the patient by bringing the family ‘to the bedside’ outside of traditional visiting hours through technologies such as Skype or email. And those conversations may well include patient progress updates thus reducing the strain on nurses who, at times, provide updates over the telephone. Additionally, family will be able to spend more time visiting patients while still being able to work remotely using free Wi-Fi.


3. Future Wearables

As the Internet of Things in healthcare becomes more commonplace, we’re likely to see increasing examples of how wearable technology can be used to monitor patients not only in the home but in a clinical setting too. Tim Kelsey used the example of patients with diabetes, one fifth of whom will have experienced an avoidable hypoglycaemic episode while in hospital. Sensor technology connected over Wi-Fi will help minimise these incidents and ensure patients do not experience additional (and avoidable) complications during their stay in hospital. Again, the upside to the healthcare provider is a reduction in the cost of providing care.


4. Happier Patients

Talk to patients (young or old) who have spent an extended time in hospital and they will more often than not tell you that at times they felt a drop in morale due to having their regular routine significantly disrupted. By offering free Wi-Fi, hospitals let patients use their own mobile devices to continue to enjoy some of those everyday activities that go a long way to making all of us happy. That might include watching a favourite TV programme, reading a daily newspaper or simply playing an online game. Being connected brings a sense of normality to what is undoubtedly a period of worry and concern, resulting in happier patients.


5. Reducing Readmissions

When we look at the team of people providing care for patients it’s easy to forget just how important family and friends are, albeit in a less formal way than clinicians. When it comes to reducing readmission my mind is drawn to the patient setting immediately after discharge from hospital where it’s likely that family and close friends will be primary carers when the patient returns home. I’m seeing a scenario whereby the patient and caregiver in a hospital connect to family members, using Skype via Wi-Fi for example, to talk through recovery and medication to help ease and increase the effectiveness of that transition from hospital to home. I believe this could have a significant impact on readmission rates in a very positive way.


Meeting Security Needs

Wi-Fi networks in a hospital setting will, of course, bring concerns around security, especially when we talk of accessing sensitive healthcare data. This should not stop progress though as there are innovative security safeguards created by Intel Security Group that can mitigate the risks associated with data transiting across both public and private cloud-based networks. And I envisage healthcare workers and patients will access separate Wi-Fi networks which offer enhanced levels of security to clinicians.


Vision to Reality

Currently there are more than 100 NHS hospitals providing Wi-Fi to patients, in some cases free and in others on a paid-for basis. What really needs to happen though to turn this vision of free Wi-Fi for all into a reality? There are obvious financial implications but I think there are great arguments for investment too, especially when you look at the clinical benefits and potential cost-savings. A robust and clear strategy for implementation and ongoing support will be vital to delivery and may well form part of the NHS feasibility study. I look forward to seeing the report and, hopefully, roll-out of free Wi-Fi across the NHS to provide an improved patient experience.


If you enjoyed this blog please drop your details here to receive our quarterly newsletter for more insight and comment on the latest Health IT issues.


Chris Gough is a lead solutions architect in the Intel Health & Life Sciences Group and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@CGoughPDX)

Check out his previous posts

Read more >

University Health Looks to Cut TCO for Its Epic and Caché Infrastructure by 40 Percent

By Steve Leibforth, Strategic Relationship Manager, Intel Corporation


How sustainable is your health IT environment? With all the demands you’re putting on your healthcare databases, is your infrastructure as reliable and affordable as it needs to be so you can stay ahead of the rising demand for services?


In Louisiana, IT leaders at one of the health systems we’ve been working with ran the numbers. Then, they migrated their InterSystems Caché database from their previous RISC platforms onto Dell servers based on the Intel® Xeon® processor E7. They tell us they couldn’t be happier—and they’re expecting the move to help them reduce TCO for their Epic EHR and Caché environment by more than 40 percent.


“Using Intel® and Dell hardware with Linux and VMware, you can provide a level of reliability that’s better than or equal to anything out there,” says Gregory Blanchard, executive director of IT at Shreveport-based University Health (UH) System. “You can do it more easily and at much lower cost. It’s going to make your life a lot easier. The benefits are so clear-cut, I would question how you could make the decision any differently.”




We recently completed a case study describing UH’s decision to migrate its Caché infrastructure. We talked with UH’s IT leaders about their previous pain points, the benefits they’re seeing from the move, and any advice they can share with their health IT peers. If your health system is focused on improving services while controlling costs, I think you’ll find it well worth a read. You’ll also learn about the Dell, Red Hat, Intel, and VMware for Epic (DRIVE) Center of Excellence—a great resource for UH and other organizations that want a smooth migration for their Epic and Caché deployments.  




UH is a great reminder that health IT innovation doesn’t just happen at the Cleveland Clinics and Kaiser Permanentes of the world. Louisiana has some of the saddest health statistics in the nation, and the leaders at UH know they need to think big if they’re going to change that picture. As a major medical resource for northwest Louisiana and the teaching hospital for the Louisiana State University Shreveport School of Medicine, UH is on the forefront of the state’s efforts to improve the health of its citizens. Its new infrastructure—with Intel Inside®—gives UH a scalable, affordable, and sustainable foundation. I’ll be excited to watch their progress.


Read the case study and tell me what you think.

Read a whitepaper about scaling Epic workloads with the Intel® Xeon® processor E7 v3.

Join and participate in the Intel Health and Life Sciences Community

Follow us on Twitter: @IntelHealth, @IntelITCenter

Read more >

Amplify Your Value: Draw Your Own Map!


The day was blistering hot! The air did not move; it was stifling. The crowd gathering in this Kansas field struggled to find shade; several people stood in the shadows of the tall grasses surrounding the field. Sweat poured off of me, even though I was standing still. I didn’t even want to fan myself because that would be too much exertion.

It was July 4th, 2004. We were standing in this field, nearing heat stroke, to commemorate the 200th anniversary of the Lewis and Clark Expedition passing through this area. (Yes, in addition to loving IT, I love history! I am SUCH a nerd!) Ok, I can hear you, “What does THIS have to do with amplifying your value, much less IT? I am pretty sure they didn’t have computers in 1804!” Bear with me for a few more paragraphs, dear IT explorer…

After standing through several speeches and re-enactments, we piled back into buses for the ride back to Atchison. Hey, at least on the bus, we could put the windows down and get a breeze…but we WERE packed in like sardines.

We poured out of the bus and headed straight to a local bar and grill for lunch, and a COLD ONE…or two…or three. And then, the lesson…there it was…posters everywhere…we had to try one…Boulevard Brewing Co., a sponsor of the Lewis & Clark event…the slogan…”To those who make maps, not follow them”. Think about that….”To Those Who Make Maps, Not Follow Them”. That is the definition of an explorer 200+ years ago: they would literally “step off the map”, going where no white man had ever gone before.

If you’ve been following our journey to Amplify Your Value, we have looked inward to see where we were; we looked forward to envision the future; we studied our business and identified the impacts it had on our organization; and, we decided we wanted to do value-add projects to the best of our abilities. Like Lewis and Clark, we were then stepping into the unknown. There was no map for where we were going. We were blazing trails.

It may seem odd to say in this series of Amplify Your Value but, what I am outlining here is NOT a roadmap for you to follow. What I am outlining here is what worked for us…it may not work for you. You have to follow the steps of introspection that we have discussed over the last several posts, but your roadmap will probably differ greatly from ours…that is OK and to be expected. Businesses are different, cultures are different, environments are different. The point here is, if you have followed the steps, you now know where you are and you now know where you are going.


Like Lewis and Clark 200-some years ago, we had a goal, we had an objective. To draw our map, we identified immediate steps we needed to take. Lewis and Clark needed specific skillsets, they needed discipline, they needed teamwork and collaboration. We needed process, we needed education, we needed different skills, we needed a deeper understanding of our mission. We identified many of the steps we needed to take on our journey. Like good IT professionals, we identified dependencies and precursors to our journey. We laid out a five-year plan.

Five years…that is eons in the IT world. Perhaps it was too much of a chunk to bite off from where we were. The first year or two were very specific. We had processes we wanted to implement, we had technologies we wanted to implement, and we had an ever-evolving business that we needed to support and…lead. Like Lewis and Clark before us, we did not bind ourselves to specifics in a future we could not foresee. Had they not been open and flexible, they never could have travelled to the Pacific Ocean and returned to “civilization”. We laid out specific steps in the near term and specific goals in the long term. Each year we review our plan and we adjust our steps; we do not adjust our strategy nor our vision.

I have to admit, some of the pieces of our journey fell into place…sometimes it is better to be lucky than good…we were able to invest in new Disaster Recovery technology as a “big bang” because our prior investments all hit their depreciation at the same time. We were able to migrate our production environment as a “big bang” for pretty much the same reason. However, there were many times on our journey that we had to adjust, to ad-lib, to step off the map we had drawn. As you learn more, as you experience more, you need to be flexible and adjust your tactics to meet your objectives.

Your journey will not be the same as ours. Your company is different, your culture is different, technology is different. You have to be willing to step into the unknown, you have to be willing to draw your own map. You have to be willing to keep your focus on the mission and the destination; and adjust your plan to reach that destination. Make Your Own Map!

Next time we will explore our first step into the unknown. For us, that was paying attention to the “Google Whisperer”, for you, it may be paying attention to a different muse.

The series “Amplify Your Value” explores our five-year plan to move from an ad hoc, reactionary IT department to a value-add, revenue-generating partner. #AmplifyYourValue

We could not have made this journey without the support of several partners, including, but not limited to: Bluelock, Level 3 (TWTelecom), Lifeline Data Centers, Netfor, and CDW. (mentions of partner companies should be considered my personal endorsement based on our experience and on our projects and should NOT be considered an endorsement by my company or its affiliates).

Jeffrey Ton is the SVP of Corporate Connectivity and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.

Find him on LinkedIn.

Follow him on Twitter (@jtongici)

Add him to your circles on Google+

Check out more of his posts on Intel’s IT Peer Network

Read more from Jeff on Rivers of Thought

Read more >

High-Performance Computing Helping to Make Dreams Come True – HLRS

This year is shaping up to be one of the best years for cinema in a while. Some of Hollywood’s most iconic characters are returning to the big screen in 2015, with new releases from the James Bond*, Star Wars* and The Hunger Games* franchises. However, few of us stop to wonder how 007 can plummet through a glass ceiling unscathed or how those crashing X-wings look so realistic… It’s all down to hidden technology.


Technology in the Talkies


Space drama Gravity* won a few Oscars at the 2014 Academy Awards, including Best Visual Effects, and it’s not hard to see why. Apparently, around 80 per cent of the scenes in Gravity were animated and computer generated. In many scenes, only Sandra Bullock’s and George Clooney’s faces existed as anything other than 1s and 0s on computers. Everything else, from space shuttles and jetpacks to space debris, was created by graphic artists using Dell* workstations powered by 4th generation Intel® Core™ i5 and i7 vPro™ processors.


Only last month, we released What Lives Inside, the fourth installment of Intel’s Inside Films series. Directed by two-time Oscar-winner Robert Stromberg, our latest social film stars Colin Hanks, J.K. Simmons and Catherine O’Hara alongside the recently launched Dell* Venue 8 7000 Series super-thin tablet with Intel® RealSense™ technology and powered by an Intel® Atom™ processor Z3580. The film took eight days to shoot and relied on 200 visual effects artists, which just goes to show what it takes to bring such whimsical worlds to life on the big screen.


HPC Helping the Film Industry


3D movies rely a lot on technology and require significant computing capacity. In a change of pace from the usual manufacturing or university research projects, The High Performance Computing Center in Stuttgart (HLRS) recently supported a local production company by rendering a 3D kids’ movie called Maya, The Bee*. This 3D flick, starring Kodi Smit-McPhee and Miriam Margolyes, is not your typical HPC challenge, but the amount of data behind the 3D visuals presented quite a mountain to climb for the makers, Studio 100.


To ensure the film was ready in time, the project was transferred to HLRS, which had recently upgraded to the Intel® Xeon® processor E5-2680 v3. Because the new system delivers four times the performance of its previous supercomputer,1 HLRS can undertake more projects and better serve critical industry needs like this one. Thanks to the HPC capacity available at HLRS, Maya, The Bee was released last month in all its 3D glory.2 “We are addicted to giving best possible service, so it is vital that we run on reliable technology,” said Bastian Koller, Manager of HLRS. For more on the HLRS supercomputer, click here.


Bringing Characters to Life


Intel and Framestore have been working together for almost five years now. However, Paddington* is the first film that Framestore has worked on with a computer-animated animal as the lead character, and the mischievous little bear caused quite a few challenges. Many characters brought to film, such as J. K. Rowling’s Dobby* or Andy Serkis as Gollum*, are shot using motion-capture technology to make them appear more lifelike, but the actor playing Paddington wore a head-mounted camera during the voice recordings so animators could see how a human face moved at every point and try to mimic it in bear form. While this gave audiences an incredible character, animating and rendering a photo-real lead character for every shot required significant processing capacity. It took 350 people across two continents working for three years to bring Paddington the CGI bear to life, but with high-performance, flexible IT solutions based on servers powered by Intel® Xeon® processors, it was a piece of cake (or a marmalade sandwich!).


Intel VP & GM, Gordon Graylish, shared his thoughts on the red carpet at the Paddington premiere, saying: “It is wonderful when [Intel® technology] gets to be used for creativity like this, and this is something that would have been impossible or prohibitively expensive [to make] even two or three years ago. Technology allows you to take the limits off your imagination.”


I’d love to hear what you think about all this hidden HPC tech used to get fantastic blockbusters into our movie theaters, so please leave a comment below or continue this conversation by connecting with me on LinkedIn or Twitter, with #ITCenter – I’m happy to answer questions you may have.


1 Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations, and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products. For more information go to


2 Released in March 2015 in the US and the UK


*Other names and brands may be claimed as the property of others.

Read more >

The Challenge of the Smart Megacity

On my recent visit to China, I was struck by the country’s commitment to investing in smart cities. China’s most recent five-year plan set aside $70 billion for smart city technologies, with around 200 cities competing for funding. This is part of a huge project of urbanization, which saw $1 trillion allocated for urban infrastructure under the same plan. Last year, the Chinese government announced its intention to increase its urban population from 53.7 percent to 60 percent by 2020, and there are already 15 megacities in China with more than 10 million people.


With 1.3 million people per week moving to and trying to build lives in cities globally through 2050, it’s no surprise that the impetus to bring “smart” to these locations has risen on the agenda of many of the most prominent cities worldwide.


If done properly, a smarter city environment should have a measurable impact on the economy, the citizens and their lifestyles, business and the environment. There’s certainly no shortage of examples of how technology can be applied to making a city smart.



BUT do applications like this amount to the “smart city”? One thing I’ve noticed from my discussions with Intel customers in China and other countries is that there is no single definition of what the smart city is. Government bodies recognize the opportunities presented by technologies like those I’ve mentioned, and it’s clear there’s a healthy degree of friendly competition amongst the cities. But where I see many struggle is in working out what they should do first or next, and what the smart city really means to them.


While this may be well understood by many of you, the focus areas we see coming up most frequently are:


  • Smart Transport/Mobility
  • Smart Home-Building-Facility
  • Smart Public Infra & Community Driven Services
  • Smart Fixed-Mobile Security Surveillance
  • Analytics and Big-Data Strategy-Planning



Irrespective of which area a city focuses on first, one thing is for sure: with the proliferation of millions of smart connected devices – on the transport network, and in anything from buildings to street lights to manholes – the result is a huge amount of flowing data. To get the best return on investment, it’s essential to plan how that data will be managed, how value can be extracted from it and what you plan to do with it. While almost every customer I talk to acknowledges they need to do something with the data, most struggle with what that something should be. Without these plans in place, the data simply piles up and creates mountains in minutes. If you haven’t done so already, I’d recommend hiring some data scientists – typically mathematicians or statisticians who can help you determine what data you need and what you might want to do with it.


On a somewhat related note, many of you will be familiar with the SMAC stack (social, mobile, analytics, cloud). This is the digital platform being laid down across industries to underpin transformation. It’s been a core part of the rapid rise we’ve seen in shared-economy companies like Uber and Airbnb. It is also fundamental to the smart city. The smart city is not just about adding connectivity to a building or other asset: it’s about the data you gather, the insights you gain, the services you can create and deliver, the accessibility you provide, the economic growth you stimulate and the communities you grow. Clearly, this all needs to be done and delivered in a secure and predictable manner. The point is not to use just one part of SMAC in isolation; the impact comes from the multiplicative effect of combining all four.


In the smart city, as much as anywhere these days, all roads lead to data. The question we need to be asking is: which roads do we want to travel?


What do you think defines the smart city? I’d be interested to read your comments below.


To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

Read more >

Ordering Kiosks Give Hardee’s a Tasty Solution for Satisfying Customers and Growing Sales

Today’s consumers move at breakneck speed, which is one reason quick-service restaurants like Hardee’s are so popular. And to stay out front, it’s essential for those restaurants to keep finding new ways to delight customers and keep them coming back. Hardee’s did it by installing new quick-service customer ordering kiosks with 24-inch, touch-based screens. Instead of waiting in line, customers can see enticing images of what’s on the menu and then order with a few quick taps on the screen. And since orders go directly to the kitchen, the food is ready sooner. It all means Hardee’s can serve more customers and bring in more revenue.


Based on Industry-Standard Technology


Hardee’s had investigated the idea of ordering kiosks a few years ago, but those available at the time were based on proprietary technology and too expensive to be practical.




The new kiosks Hardee’s chose are based on a Dell OptiPlex all-in-one system equipped with Intel® Core™ i5 vPro™ processors and Windows 8.1 Pro.


Using industry-standard technology like Windows gives Hardee’s the flexibility to run other applications, including software used by employees and managers. It’s also convenient for software developers, who can use familiar programming environments, and for the restaurant’s IT administrators, who can use existing Microsoft systems management tools.


The all-in-one form factor increases deployment flexibility, since Hardee’s can mount the kiosks in a variety of places, depending on the layout of each restaurant.


Controlling Costs in the Future


With the success of the kiosks, Hardee’s is now considering using all-in-one systems to gradually replace point-of-sale (POS) systems at the counter as a tasty solution for delivering an outstanding customer experience and controlling costs.


To learn more, take a look at the Hardee’s solution here or read more about it here. To explore more technology success stories, visit or follow us on Twitter.

Read more >

Big Data is Changing the Football Game

The football authorities have been slow to embrace technology, at times actively resisting it. It’s only been two seasons since some of Europe’s top leagues were authorized to use goal-line technology to answer the relatively simple question of whether or not a goal has been scored, i.e., has the whole ball crossed the goal line.


This is something the games of tennis and cricket have been doing for nearly ten years, but for one of the world’s richest sports, it risked becoming a bit of a joke.  As one seasoned British manager once said, after seeing officials deny his team a perfectly good goal: “We can put a man on the moon, time serves of 100 miles per hour at Wimbledon, yet we cannot place a couple of sensors in a net to show when a goal has been scored.” The authorities eventually relented, of course, their hand forced by increasingly common, high profile and embarrassing slip-ups.


But while the sport’s governing bodies were in the grips of technological inertia, the world’s top clubs have dived in head first in the last ten to fifteen years, turning to big data analytics in search of a new competitive advantage. In turn, this has seen some innovative companies spring up to serve this new ‘industry’, companies like Intel customer Scout7.


Taking the Guesswork out of the Beautiful Game


Big data has become important in football in part because it is big business. And for a trend that is only in its second decade, things have moved fast since the days of teams of hundreds of scouts collecting ‘data’ in the form of thousands of written reports in an effort to provide teams with insights into the opposition or potential new signings.


Now, with tools like Scout7’s football database, powered by a solution based on the Intel® Xeon® processor E3 family, clubs have a fast, sophisticated system they can use to enhance their scouting and analysis operations.


For 138 clubs in 30 leagues, Scout7 makes videos of games from all over the world available for analysis within two hours of the final whistle[1]. At the touch of a button, clubs can take some of the guesswork and ‘instinct’ out of deciding who gets on the pitch, as well as the legwork of keeping tabs on players and prospects from all over the world.



Pass master: Map of one player’s passes and average positions from the Italian Serie A during the 2014-15 season


Using big data analytics to enable smarter player recruitment is among Scout7’s specialties. For young players, without several seasons of experience on which to judge them, this can be especially crucial. How do you make a call on their temperament or readiness to make the step up? How will they handle the pressure? As we enter the busiest recruitment period of the football calendar – the summer transfer window – questions like this are being asked throughout the football world right now.


Delving into the Data


It’s a global game, and Scout7 deals in global data, so we can head to a league less travelled for an example: the Czech First League. The UEFA Under-21 European Championships also took place this summer and, with international tournaments often acting as shop windows for the summer transfer market (which opened on 1st July – a day after the tournament’s final), it makes sense to factor this into our analysis.


So, let’s look at the Scout7 player database for players in the Czech First League who are currently Under-21 internationals, to see who has had the most game time and therefore the most exposure to the rigors of competitive football. We can see that a 22-year-old FC Hradec Králové defender played every single minute of his team’s league campaign this season – 2,700 minutes in total.


Another player’s on-field time for this season was 97% — valuable experience for a youngster. Having identified two potential first-team ready players, Scout7’s database would allow us to take a closer look at the key moments from these games in high-definition video.
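Scout7’s actual platform and data schema aren’t public, so the records, field names, and 30-game season length below are purely illustrative. Still, a minimal sketch shows the kind of filter-and-rank query described above, using the article’s numbers (2,700 minutes is a full season; 2,619 minutes works out to 97%):

```python
# Hypothetical player records; Scout7's real schema is not public.
players = [
    {"name": "Defender A", "age": 22, "u21_international": True, "minutes": 2700},
    {"name": "Midfielder B", "age": 20, "u21_international": True, "minutes": 2619},
    {"name": "Forward C", "age": 27, "u21_international": False, "minutes": 2400},
]

# A 30-game league season at 90 minutes per game = 2,700 available minutes.
SEASON_MINUTES = 30 * 90

def game_time_share(player):
    """Fraction of the season's available minutes the player spent on the pitch."""
    return player["minutes"] / SEASON_MINUTES

# Keep only current Under-21 internationals, then rank them by game time.
u21 = [p for p in players if p["u21_international"]]
u21.sort(key=game_time_share, reverse=True)

for p in u21:
    print(f'{p["name"]}: {game_time_share(p):.0%}')
```

A real scouting database would of course add leagues, positions, and per-match detail, but the ranking logic stays the same: normalize minutes played against minutes available, then sort.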


Check out our infographic, detailing the fledgling career of another player in the context of the vast amount of data collection and analysis that takes place within Scout7.


Scout7 player profile


“Our customers are embracing this transition to data-driven business decision-making, breaking away from blind faith in the hunches of individuals and pulling insights from the raft of new information sources, including video, to extract value and insights from big data,” explains Lee Jamison, managing director and founder, Scout7.


Scout7’s platform uses Intel® technology to deliver the computing power and video transcoding speed that clubs need to mine and analyze more than 3 million minutes of footage per year, and its database holds 135,000 active player records.


Lonely at the Top


There’s only room for one at the top of elite sport, and the margins between success and failure can be centimeters or split seconds. Identifying exactly where to find those winning centimeters and split seconds is where big data analytics really comes into its own.


Read the full case study.


To continue this conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

Find me on LinkedIn.

Keep up with me on Twitter.


*Other names and brands may be claimed as the property of others.


[1] Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products. For more information go to

Intel does not control or audit the design or implementation of third party benchmark data or Web sites referenced in this document. Intel encourages all of its customers to visit the referenced Web sites or others where similar performance benchmark data are reported and confirm whether the referenced benchmark data are accurate and reflect performance of systems available for purchase.

Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. Check with your system manufacturer or retailer or learn more at

Read more >

10 Mobile BI Strategy Questions: Executive Sponsorship

Of the ten mobile BI questions I outlined in my last post, “Do we have an executive sponsor?” is the most important one because the success of a mobile BI journey depends on it more than any other. While the role of an executive sponsor is critical in all tech projects, several aspects of mobile BI technology make it easy for executive management to be involved closely and play a unique role.


Moreover, although the CIO or the CTO plays a critical role in making sure the right technology is acquired or developed, executive sponsorship from the business side provides the right level of partnership to run on all three cylinders of BI: insight into the right data, for the right role, at the right time.


Why Do We Need an Executive Sponsor?


We need executive sponsorship because, unlike grassroots efforts, business and technology projects require a top-down approach. Whether the strategy is developed as part of a structured project or as a standalone engagement, the executive sponsor delivers three critical ingredients:


  1. The mobile BI strategy is in line with the overall business strategy.
  2. The required resources are made available.
  3. Necessary guidance is provided in order to stay the course.


Is Having an Executive Sponsor Enough?


Having an executive sponsor only on paper isn’t enough, however. How much commitment the executive sponsor makes and how much leadership he/she provides have a direct impact on the outcome of the strategy. Thus, the ideal executive sponsor of a mobile BI initiative is a champion of the cause, an ardent mobile user, and the most active consumer of its assets.


What Makes an Ideal Executive Sponsor for Mobile BI?


How does the executive champion the mobile BI initiative? First and foremost, he/she leads by example — no more printing paper copies of reports or dashboards. This means that the executive is keen not only to consume the data on mobile devices but also to apply the insight derived from these mobile assets to decisions that matter. Using the technology demonstrates firsthand the mobile mindset that sets an example for the rest of the direct reports and their teams. In addition, by recognizing the information available on these mobile BI assets as the single version of the truth, the executive provides a clear and consistent message for everyone to follow.


Is Mobile BI Easier to Adopt by Executive Sponsors?


Without a doubt, mobile BI, just like mobility in general, suits a wide range of users, starting with executives. Unlike the PC, which wasn’t mobile at all, and the laptop, which provided limited mobility, tablets and smartphones offer a perfect combination of mobility and convenience. This ease of use makes these devices ideal for winning over even those executives who may initially have been reluctant to include mobile BI in their arsenals or to use it in their daily decision-making activities.


The mobility and simplicity may give the executives additional incentives to get involved in the development of requirements for the first set of mobile BI assets because they can easily see the benefits of having access to critical information at their fingertips. These benefits include an additional opportunity for sales and marketing to use mobile BI to showcase new products and services to customers (an approach that reflects the innovation inherent in the use of this technology).


Bottom Line: Executive Sponsorship Matters


The most important goal of a mobile BI strategy is to enable faster, better-informed decision making. Executive sponsorship matters because with the right sponsorship, the mobile BI initiative will have the best chance to drive growth and profitability. Without this sponsorship — even with the most advanced technology in place — a strategy will face an uphill battle.


What other aspects of executive sponsorship do you see playing a role in mobile BI strategy?


Stay tuned for my next blog in the Mobile BI Strategy series.


Connect with me on Twitter at @KaanTurnali and LinkedIn.


This story originally appeared on the SAP Analytics Blog.

Read more >