Recent Blog Posts

The Coding Illini Claim Victory in the Intel® 2014 Parallel Universe Computing Challenge

$26,000 awarded to National Center for Women and Information Technology charity

 

The Coding Illini, a team from NCSA and the University of Illinois at Urbana–Champaign, was declared the winner of the 2014 Intel® Parallel Universe Computing Challenge (PUCC) after a final competition that had plenty of excitement as both the Coding Illini and the Brilliant Dummies met their match with a tough coding round.

 

The final challenge was more substantial than prior matches and was the only one this year that used Fortran. The larger code was the undoing of both teams, as each made more changes than they were able to debug in their short ten minutes. The Coding Illini added to the drama when their final submission contained a coding error that appears to have broken the convergence of a key algorithm in the application. Their modified application was still iterating long after the victor was declared and the crowds had dispersed. Event co-host James Reinders suspected that, based on their progress, both teams were only a few minutes away from success, and that either team could have won easily by attempting a little less and posting a working result. The Coding Illini were declared the winner of the match on the strength of their performance in the trivia round. Based on the Illini’s choice of charitable organization, Intel will donate $26,000 to the National Center for Women and Information Technology.

 

The Coding Illini, who were runners-up in the 2013 competition, celebrate the charitable award Intel will make to the National Center for Women and Information Technology on their behalf. The team includes team captain Mike Showerman, Andriy Kot, Omar Padron, Ana Gianaru, Phil Miller, and Simon Garcia de Gonzalo.

 

 

James later revealed that all the coding rounds were based on code featured in the new book High Performance Parallelism Pearls (specifically, code from Chapters 5, 9, 19, 28, 8, 24, and 4, in that order; the original programs, which are effectively the solutions, are available from http://lotsofcores.com). The competition problems were created by minimally changing the programs: deleting some of the pragmas, directives, and keywords associated with the parallel execution of the applications.
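The contest codes themselves are C and Fortran with OpenMP-style pragmas and directives, but here is a loose Python analogy using Numba to show the flavor of the exercise. The toy function is my own, not one of the book's programs; in a "contest" version, the parallel annotations below would have been deleted, leaving a correct but serial loop for the team to restore.

```python
import numpy as np
from numba import njit, prange

# Toy example for illustration only. In a contest-style deconstruction, the
# parallel=True option and prange would be removed, so the loop still runs
# correctly but serially; the team's job is to put the parallelism back.
@njit(parallel=True)
def scaled_sum(a, b, alpha):
    out = np.empty_like(a)
    for i in prange(a.shape[0]):  # serial version would use plain range(...)
        out[i] = a[i] + alpha * b[i]
    return out

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)
print(scaled_sum(a, b, 2.0)[:5])
```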

 

Complete Recap

 

This year’s PUCC at SC14 in New Orleans started with broad global participation: three U.S. teams, two teams each from Asia and Europe, and a Latin American team. In recognition of SC’s 26th anniversary, the teams were playing for a $26,000 prize to be awarded to the winning team’s selected charity.

 

On the opening night of the SC14 exhibition hall, last year’s winners, the Gaussian Elimination Squad from Germany who were playing for World Vision, eliminated their first round opponent, the Invincible Buckeyes from the Ohio Supercomputer Center and the Ohio State University who were playing for CARE. The German team had a slight lead after the first round that included SC conference and HPC trivia. Then their masterful performance in the coding round even amazed James Reinders, Intel’s Software Evangelist and the designer of the parallel coding challenge.

 

In the second match, The Brilliant Dummies from Korea selected the Asia Injury Prevention Foundation as their charity. They faced off against the Linear Scalers from Argonne National Lab, who chose Asha for Education. After round one, the Brilliant Dummies were in the lead with their quick and accurate answers to the trivia questions. Then in round two, the Seoul National University students managed to get the best Intel® Xeon® and Intel® Xeon Phi™ performance with their changes to parallelize the code in the challenge. This performance cemented their lead and sent them on to the next round.

 

With the first two matches complete, the participants for the initial semi-final round were now identified. The Gaussian Elimination Squad would face The Brilliant Dummies.

 

Match number three, another preliminary round match, pitted Super Computación y Calculo Cientifico (SC3) representing four Latin American countries against the Coding Illini. The Coding Illini had reached the finals in the 2013 PUCC, and were aiming to improve their performance this year.  This was the first year for SC3, who chose to play for Forum for African Women Educationalists. In a tightly fought match, the Coding Illini came out on top.

 

In the final preliminary round match, Team Taiji, representing four of the top universities in China, chose the Children and Youth Science Center, China Association for Science and Technology, as their charity. They faced the EXAMEN, representing the EXA2CT project in Europe, who were playing for Room to Read. The team from China employed a rarely used strategy by fielding four different contestants across the trivia and coding rounds of the match, and held the lead after the first round. Up until the very last seconds of the match it looked as though Taiji might be victorious. However, the EXAMEN submitted a MAKE at the very last second which improved the code performance significantly. That last-second edit proved to be the deciding factor in the victory for the team from Europe.

 

So the Coding Illini would face the EXAMEN in the other semifinal round.

 

When the first semifinal match between the Gaussian Elimination Squad and The Brilliant Dummies started, the Germans were pretty confident. After all, they were the defending champions and had performed extraordinarily well in their first match. They built up a slight lead after the trivia round. When the coding round commenced, both teams struggled with what was a fairly difficult coding challenge that Reinders had selected for this match. As he had often reminded the teams, if they were not constrained by the 10 minute time limit, these parallel coding experts could have optimized the code to perform at the same or even better level than the original code had before Reinders deconstructed it for the purposes of the competition. As time ran out, The Brilliant Dummies managed to eke out slightly better performance and thus defeated the defending champions. The Brilliant Dummies would move on to the final round to face the winner of the EXAMEN/Coding Illini semi-final match.

 

In the other semifinal match, the Coding Illini took on the EXAMEN. At the end of the trivia round, the Coding Illini were in the lead. But as the parallel coding portion of the challenge kicked in, the EXAMEN looked to be the winner…until the Coding Illini submitted multiple MAKE commands at the last second to pull out a victory by just a small margin. They had used the same strategy on the EXAMEN that the EXAMEN had used in their match against Taiji. Coding Illini had once again made it to the final round and set up the final match with The Brilliant Dummies.

Read more >

Building a Truly Collaborative Enterprise

The benefits of having a highly collaborative enterprise are a given. It’s not just the positive impact on business results, such as reduced time to market, better product quality, and improved customer satisfaction; the benefits also translate to better knowledge and people retention, workforce motivation, and cohesiveness of the overall organization. On the other hand, the challenges of ingraining a culture of collaboration within the organization are equally large, if not larger.

 

A fundamental level of collaboration does happen in every enterprise. People share content, files, e-mails, ideas, apps, and whatever else is necessary to get the work done. I call this collaboration by necessity. This includes the collaborative behavior demonstrated when ‘collaboration’ is mandated by senior management. Collaboration by choice is when people proactively start any task with a collaborative mindset, in the absence of any mandate, necessity, or obligation.

 

When thinking of creating a collaborative organization, start with a people-centric approach instead of tools and technology. Don’t be afraid to review and revamp the holy cows of annual performance reviews, rewards and recognition, and career promotions. Identify the key areas where you would like to see more collaboration and remove any hurdles – process, workflow, budget, and tools – that would be a hindrance. Define a balanced scorecard that gives you an indication of progress, not just motion.

 

Tools and Technology


Ask any IT manager or technologist how to improve collaboration within the organization, and they will come up with a list of tools and technologies that, once deployed, are supposed to guarantee improved collaboration. A fancy-looking dashboard will show how many groups have been created, how many documents and other content have been shared, comments posted, adoption rates, and other indicators that, collectively, are expected to show how much collaboration is happening within the organization.

 

As someone once said, “Do not confuse motion with progress. A rocking horse keeps moving but does not make much progress.” Indicators and dashboards have to be developed that reflect the impact on business results. Have we accelerated the design, development, or some other process? Has the day-1 quality of our product improved? In order to track return on investment, the dashboard has to include hard data that shows a clear and direct impact on business results; for example, the number of support calls dropped by 50 percent with the new product launch compared to the previous one.

 

Processes and Workflow

In most cases, when an organization selects tools and technologies for enabling collaboration, it compares features and functions. In fact, I would go out on a limb and say that no mapping is ever done to see whether the selected tools will adapt to the processes and workflow of the organization. It is usually assumed that a management mandate, training, and change management will encourage users to adapt to the tools instead of the other way around.

 

This assumption works only if management is also willing to do away with the processes that are a hurdle to frictionless collaboration. If the processes and workflow are not in sync with the tools, the extra burden of adapting to those tools will erode workforce productivity. Yes, there will be some productivity loss during the ramp-up phase, but in steady state the collaboration tools and the organization’s processes should be in sync to be frictionless.

 

People and Incentives

While tools, technologies, and processes enable or facilitate collaboration, it is people who actually collaborate. Unfortunately, this fact usually comes as an afterthought to most organizational leaders. On more than one occasion I have read and heard about the typical management chutzpah of announcing restructuring, cutbacks, and layoffs on one hand while ‘encouraging’ the organization to become more collaborative and share knowledge on the other!

The other irony I see is that in most knowledge-based industries, where collaboration is of paramount importance and can clearly create a differentiator, the incentives are stacked against it. Individual performance is rewarded more than team performance. Deep expertise is touted more than collaborative results. Teams are scattered around the globe without any globalization strategy conducive to collaboration. Travel budgets are cut on the assumption that video conferencing can replace highly interactive face-to-face discussions and team building. In short, the human and humane aspect is ignored with the faulty assumption that technology can bridge the gap.

Increasing collaboration within the organization is about culture shift, management and leadership, and people empowerment, supplemented with tools and technologies. The strategy should be thought out at the highest possible level of the organization instead of being driven bottom up.

This shift in the mindset and behavior of the organization is complex and requires focused attention from management. It cannot happen overnight and, if ignored, will revert to non-collaborative behavior very rapidly.

It can be done, and the rewards are well worth the effort!

 

Opinions expressed herein are my own and do not reflect that of my employer, Intel Corporation. My other posts can be read here and more about me is available on my website.

Read more >

What Is Mobile Business Intelligence?

You might have heard this statistic by now: more people own a cell phone than a toothbrush.

 

In a Forbes post, Maribel Lopez lists a number of recent statistics about mobility. “While we could debate the numbers, the trend is clear,” she writes. “The pace of mobile adoption across devices and applications is accelerating.” Mobility is no longer a nice-to-have option. Instead, it’s become a must for many businesses. Many surveys support this view. According to the Accenture CIO Mobility Survey 2013, “79% of respondents cited mobility as a revenue-generator and 84% said mobility would significantly improve customer interactions.”

 

The evolution of mobile BI

 

With this paradigm shift comes the natural extension of business intelligence (BI) to mobile business intelligence (mobile BI), also called mobile intelligence. The two terms are sometimes used interchangeably and may mean different things to different people, but your perception of mobile BI will be influenced primarily by your understanding of BI.

 

In my post “What Is Business Intelligence?” I defined BI as the framework that enables organizations of all sizes to make faster, better-informed business decisions. Mobile BI extends this definition and puts the emphasis on the application of mobile devices such as smartphones or tablet computers.

 

Therefore, you can argue that the fundamentals remain unchanged: mobile BI is the enabler that, if designed, implemented, and executed effectively, can help organizations drive growth and profitability.

 

However, the way organizations go about realizing the true value of mobile BI may depend on the state of their enterprise mobility (for example, whether or not a formal mobile enterprise strategy and a road map exist) and the level of their BI maturity.

 

Harnessing the power of mobile BI

 

Mobile BI is more prevalent and more relevant today because the gap between the experience of consuming traditional BI content on a desktop PC and accessing it on a mobile device is disappearing rapidly. We now talk instead about the gap between a smartphone and a tablet, as tablets keep getting smaller and lighter to compete with our smartphones.

 

Rapid growth in areas such as the cloud, in-memory technology, big data, and predictive analytics is fueling this innovation cycle. As a result, companies are looking for ways to harness the power of mobile BI through innovation and without disruption.

 

As businesses face more obstacles and are forced to deal with more complex challenges, they increasingly require greater mobile access to more processed data coming from both structured sources (such as sales data by markets and geography), and unstructured sources (like social media or email data that can’t be easily queried with traditional tools and technologies).

 

Companies at the leading edge seek to exploit mobile BI to support a workforce that’s becoming more and more mobile.

 

Mobile BI can become a key differentiator

 

According to IDC, the “world’s mobile worker population will reach 1.3 billion, representing 37.2% of the total workforce by 2015.” The share of the mobile workforce is even higher if we focus on business roles such as sales, where mobility is a critical component of success. Business models that rely on insight delivered through outdated or limited capabilities can no longer compete in an ever-expanding global market, which increasingly dictates mobile execution.

 

Today, there’s no doubt that both for-profit and not-for-profit organizations must deliver more for their customers and stakeholders. In this context, mobile BI can become a key differentiator in helping organizations cope with both the complexity and the real-time challenges they face with the execution of their strategy.

 

It’s a transformative force that has the power to change how businesses deliver value today, because mobile BI further breaks down the walls of information silos, dramatically extending the ability to gain actionable insight through data-driven analysis for decision makers at all levels of an organization. Where do you see mobile BI adding value to your organization?

 

Connect with me on Twitter (@KaanTurnali) and LinkedIn.

 

This story originally appeared on The Decision Factor.

Read more >

Part 4 – Transforming the Workplace: An Integrated Strategy for Change

This is part 4 of my blog series about transforming the workplace. Be sure to read part 1, part 2, and part 3 in the series.


“If you can’t help people change, technology changing all around them won’t make the slightest difference.”

– Dave Coplin, Business Reimagined

 

It’s a common misconception in the business world that new technology equals change. This blog series has been exploring how the workplace is changing and the inevitable challenges of innovation. And while we know that technology is key to achieving transformation in the workplace, it’s only part of the story. Here I want to discuss the final component: applying an inclusive, integrated strategy to facilitate change throughout the organization with the right partnerships and culture change.


The need for a triumvirate approach: Culture, IT, and facilities

After the technology foundation is established, the rubber meets the road. The next step is putting the vision of workplace transformation into practice. To enable true transformation across the business, Intel recommends a triumvirate approach to address company culture, IT, and facilities.


Culture: Supporting change at every turn
A few companies are leading the pack when it comes to progressive culture. And why? It’s because they have embraced new styles of working from the top down. And in many cases, it involves playing games, supporting physical fitness, and so on. To facilitate change throughout your organization, it’s important to embrace the following key attributes:

  • Innovation
  • Velocity
  • Openness
  • Accountability

 

And a final note on the technology angle: one of the major challenges companies face is “tool fatigue.” If a new tool is brought in without a proper introduction and an explanation of its value, employees may dismiss it as unnecessary and, ultimately, the project is seen as a failure. The missing link here is simply leadership and communication.

IT via the SMAC stack

There is consensus across the IT industry and analyst community that the social, mobile, analytics, cloud (SMAC) paradigm is the new platform for enabling the digital business. In the convergence of these four components, IT can change the way work gets done and ultimately drive transformation.

 

Social

Social computing provides a natural, intuitive way for people to communicate and collaborate by eliminating traditional communication hierarchies.

Mobile
Today, work is no longer a place that you go to; it’s what you do. Mobile computing is what makes this possible, with the ability to work anywhere, anytime, for greater business agility.

Analytics

Advanced analytics deliver insights at the point of decision to help speed decision making. Analytics can also enable a “Smart Advisor” to bring business-critical data to all employees.

Cloud

With shared IT systems in the cloud, employees can have access to the information they need anytime, on any device, from any location—including device and data synchronization.

 

Facilities innovation
Finally, to support new ways of working, you need the right work environment. It all boils down to achieving harmony between the workplace and the work style so that the two are aligned. This means that physical spaces should be places that employees actively want to engage in, rather than places they feel they have to be.

On one hand, facilities need to cater to the needs of work groups and collaborators, yet they must also serve those who need interruption-free environments for intensive tasks. Unfortunately, many offices today do little to inspire people, provide poor collaboration facilities, and use space inefficiently, which ultimately impacts the bottom line.

It’s also interesting to consider how facilities and IT are set to come together. For example, a conference table in a meeting room today is just a table. Yet in the near future, it may be equipped with a touch-screen surface and Internet connectivity. Due to this inevitable crossover between facilities and IT, an ideal workplace transformation strategy requires those responsible for both facilities and IT to work together to realise the best environment.

 

Intel paves the way

In the next and final blog in this series, I’ll step through some examples of how Intel has implemented the triumvirate approach across its culture, IT, and facilities. And as previously mentioned, I’m currently working on a paper, available soon, that will expand on Intel’s vision of workplace transformation.

How is your organization managing workplace transformation? Please join the conversation and share your thoughts. And be sure to click over to the Intel® IT Center to find resources on the latest IT topics.

 

Until the next time …

 

Jim Henrys, Principal Strategist

Read more >

Intel Adds New Dimension to SSDs

Imagine a fast and powerful 1 terabyte solid-state drive (SSD) that fits on your fingertip.

 

That’s enough storage capacity to hold more than 200,000 songs or more than 150 hours of high definition video! The day is coming when your tablet will have enough room to hold every song you can imagine, plus all your photos, videos and more. And it’s coming sooner than you think.

 

At Intel’s Investor Day yesterday, Rob Crooke, Intel vice president and general manager of the Non-Volatile Memory Solutions Group (NSG), unveiled Intel’s plans to begin production of 3D NAND for use in consumer and data center SSDs starting in the second half of 2015.

 

3D NAND is a sensational technological advancement allowing SSDs to store more data in less space, increase overall drive capacity, reduce power consumption and improve system-level performance at a lower cost to users. Intel achieves this by packing more storage density onto the SSD. It’s like taking a plot of land and building a high-rise apartment building as opposed to a single-family home. To show off the new 3D NAND, Rob presented from a computer featuring a prototype SSD utilizing the new technology.

 

Intel capitalized on its decades-long history of microchip manufacturing innovation to overcome the challenge of drilling 4 billion holes in a silicon chip. This means Intel is able to deliver unprecedented density at 256 Gbits per die, meaning we can deliver higher capacities at a lower cost. This enables us to continue to deliver on the promise of Moore’s Law by doubling storage capacity and enabling our CPUs to really show off their unique capabilities and tremendous performance. The potential 3D NAND brings to Intel SSDs is truly inspiring.
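For a sense of scale, here is the simple arithmetic connecting the 256 Gbit die to the fingertip-sized 1 terabyte drive mentioned at the top of the post. This is a back-of-envelope sketch only; real drives add over-provisioning and error-correction overhead that it ignores.

```python
# Rough arithmetic only: ignores over-provisioning and ECC overhead
# that a real SSD reserves on top of user-visible capacity.
gbits_per_die = 256
gb_per_die = gbits_per_die / 8          # 256 Gbit = 32 GB per die
dies_for_1tb = 1000 / gb_per_die        # decimal terabyte
print(f"{gb_per_die:.0f} GB per die -> about {dies_for_1tb:.0f} dies for a 1 TB drive")
```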

 

In data center applications, having more storage closer to the CPU enables fast transactions, quick access to real-time data, and short wait times for content. Intel’s 3D NAND delivers stunning performance and is very cost effective. Just one 4-inch server rack of Intel SSDs can deliver 11 million IOPS (input/output operations per second). For comparison, you would need a rack of hard disk drives measuring 500 feet tall to churn out the same performance. Beyond the savings in the cost of the drives, imagine the immense savings in power and cooling!
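Out of curiosity, here is a rough back-of-envelope version of that comparison. The per-drive IOPS and drives-per-rack-unit figures are my own illustrative assumptions, not numbers from Intel, so the result is only meant to land in the same ballpark.

```python
# Back-of-envelope check of the rack comparison above. Only the 11M IOPS
# figure comes from the post; the HDD numbers are assumptions for illustration.
ssd_rack_iops = 11_000_000      # from the post
hdd_iops = 200                  # assumed IOPS per enterprise HDD
hdds_per_u = 12                 # assumed drives per 1U enclosure
u_height_in = 1.75              # standard rack unit height, inches

hdds_needed = ssd_rack_iops / hdd_iops
rack_units = hdds_needed / hdds_per_u
height_ft = rack_units * u_height_in / 12
print(f"{hdds_needed:,.0f} HDDs ≈ {rack_units:,.0f}U ≈ {height_ft:,.0f} feet of rack")
# -> on the order of 55,000 HDDs and several hundred feet of rack,
#    the same ballpark as the 500-foot figure quoted above.
```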

 

For consumers it means more storage where you need it: tablets and notebooks for photos, music and games; home theaters for hours of HD content delivered with almost no lag; and in vehicle infotainment systems to store maps, music and more. These benefits are just the tip of the iceberg.

 

Intel will continue its fruitful and long-term relationship with Micron and jointly held IM Flash Technologies (IMFT) to produce the new multi-level cell (MLC) flash chips with products available in the second half of 2015. For more information on Intel SSDs and non-volatile memory, visit http://www.intel.com/ssd.

 

Frank Ober

Read more of Frank’s SSD related posts

Read more >

Tech on Turkey Day: Simplifying and Adding Flair to Your Holiday Celebration

While the holidays are filled with fun, excitement and favorite traditions, they are also a hectic and sometimes stressful time of the year. Amidst shopping, cooking and entertaining, we often feel as though we might benefit from an extra helper … Read more >

The post Tech on Turkey Day: Simplifying and Adding Flair to Your Holiday Celebration appeared first on Technology@Intel.

Read more >

Both the President and Congress Have a Role in Fixing America’s Immigration System

By Peter Muller, director of Immigration Policy for Intel Fundamental reforms are needed to our Nation’s immigration laws for Intel to be able to hire enough talented people to support our advanced manufacturing and R&D operations in the United States. Ultimately, … Read more >

The post Both the President and Congress Have a Role in Fixing America’s Immigration System appeared first on Policy@Intel.

Read more >

The Finer Points Of Evaluating Battery Life

Our increasingly mobile lifestyles force us to rely heavily on our device’s batteries. We’re constantly seeking to get a little extra juice out of our laptops, phones, and tablets. Tablets, in particular, have become a prominent platform for both the home and office, and we rely on them to feature better battery life than many of our other devices. While some tablets boast 12+ hours of battery life, it’s important to understand that these devices are much more than just a battery — the rest of the device’s hardware specifications may have even more to do with battery life than the actual battery does.

 

For example, it’s a common misconception that so-called “power-efficient” processors may drain batteries slower, therefore giving you a device that can put in a full day of work. In many cases the opposite is true. Full-powered processors that perform computations quickly and efficiently can actually have less impact on a device’s battery by completing tasks and returning the device to a resting state faster.
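A crude way to see that "race to idle" argument is to compare total energy over a working day. The power and time figures below are entirely made up for illustration; the point is only that energy is power multiplied by time, so a faster chip that spends more of the day idle can come out ahead.

```python
# Minimal "race to idle" sketch with made-up numbers: the higher-power chip
# finishes the same work sooner and idles longer, so its total energy over a
# 10-hour day can be lower than the slower "power-efficient" chip's.
def energy_wh(active_w, active_h, idle_w, idle_h):
    return active_w * active_h + idle_w * idle_h

day_h = 10
fast_work_h, slow_work_h = 2.0, 5.0   # assumed time each chip needs for the same tasks
fast = energy_wh(active_w=8.0, active_h=fast_work_h, idle_w=0.5, idle_h=day_h - fast_work_h)
slow = energy_wh(active_w=4.0, active_h=slow_work_h, idle_w=0.5, idle_h=day_h - slow_work_h)
print(f"fast chip: {fast:.1f} Wh, slow chip: {slow:.1f} Wh over a {day_h}-hour day")
# fast: 8*2 + 0.5*8 = 20.0 Wh; slow: 4*5 + 0.5*5 = 22.5 Wh
```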


Battery life also depends on many factors beyond processing speed. While primarily a concern for laptops and 2-in-1 devices, connected peripherals like external hard drives and speakers can drain battery life from your device, lowering the odds that you’ll make it through the day without a charge. Other factors that determine your device’s battery life include your operating system, the number of running programs, and whether or not you’re running an animated wallpaper.

 

Operating System & Battery Life

 

Some operating systems are designed to work in conjunction with your device’s processor to optimize battery life. Google and Microsoft coordinate with chip makers to ensure tablet processors are designed with a specific mobile operating system in mind. Additionally, your operating system may have power-saving features that let you control display brightness and other settings to decrease power consumption.

 

Wallpapers & Background Processes

 

Some of the biggest battery killers hide behind the scenes. While animated wallpapers can be a fun way to personalize your device, enabling them on your tablet can drain your battery faster than you want. The animations represent a persistent task that your processor has to run, which lowers power efficiency.

 

In addition to your wallpaper, the number of apps running in the background can significantly affect your battery life. To keep them in check, consider quitting any applications not in immediate use in order to give your processor and battery a rest.

 

These are only a few of the factors that determine your device’s battery life. To learn more, read the blog Breaking Down Battery Life. You can also get a comprehensive look at how your device distributes power by checking out this white paper on evaluating battery life.

Read more >

Transforming Healthcare through Big Data

Frustration with electronic health record (EHR) systems notwithstanding, the data aggregation processes that have grown out of healthcare’s adoption of the electronic health record are now spawning analytical capabilities that were unthinkable just 15 years ago. By leveraging big data to track everything from patient recovery rates to hospital finances, healthcare organizations are capturing and storing data sets that are changing the way doctors, caregivers and payers tackle larger scale health issues.

 

It’s not just happening on the clinical side, either, where EHRs are extending real-time patient information to doctors and predictive analytics are helping physicians to better track and understand their patients’ medical conditions.

 

In Kentucky, for example, tech investments by the state’s largest provider systems are estimated at over $600 million, a number that doesn’t even reflect investments from two of the biggest local organizations, Baptist Health and University of Kentucky HealthCare. The data collected by these hospitals includes—and far exceeds—the EMR basics mandated under ARRA, according to an article in The Lane Report.

 

While the goal of improving quality of care is, of course, a key driver of such investments, so is the government mandate tying Medicare and Medicaid reimbursement to outcomes. According to a recent report from McKinsey & Company, more than 50 percent of doctors’ offices and almost 75 percent of hospitals nationwide are managing patient information electronically. So, it’s not surprising that big data is catching the attention of healthcare’s management teams.

 

By quantifying and analyzing an endless variety of metrics—including things like R&D, claims, costs, and insights gleaned from patients—the industry is refining its approach to both preventative care and treatment, and saving money in the process. A good example can be found in the analysis of data surrounding regression rates, which some hospitals are now using to stave off premature releases and, by extension, exorbitant penalties.

 

Others, such as Brigham and Women’s Hospital, already are applying algorithms to generate savings beyond readmissions, in areas that include: high-cost patients, triage, decompensation, adverse events, and treatment optimization.

 

While there’s room to debate the extent to which big data is improving patient outcomes—or the scope of savings attributable to big data initiatives given the associated system costs—the trend toward leveraging data for better outcomes and savings will only continue to grow as CIOs advance meaningful implementations of solutions, and major technology companies continue to expand the industry’s basket of options.

 

How is your healthcare organization applying big data to overcome challenges? Have the results proven worthwhile?

 

As a B2B journalist, John Farrell has covered healthcare IT since 1997 and is a sponsored correspondent for Intel Health & Life Sciences.

Read John’s other blog posts

Read more >

Intel Statement on President’s Executive Order on Immigration

Fundamental reforms are needed to our Nation’s immigration laws for Intel to be able to hire enough talented people to support our advanced manufacturing and R&D operations in the United States. Ultimately, this will require a legislative solution and we are … Read more >

The post Intel Statement on President’s Executive Order on Immigration appeared first on Policy@Intel.

Read more >

Recapping Intel Highlights at SAP TechEd 2014: Videos and Animations

SAP TechEd 2014 at Las Vegas was an exciting and enjoyable show, brimming with opportunities to learn about the latest innovations and advances in the SAP ecosystem. Intel had its own highlights, as I explain in this video overview of Intel’s key activities. These included the walk-on appearance of Shannon Poulin, vice president of Intel’s Data Center Group, during SAP President Steve Lucas’s executive keynote. Shannon did his best to upstage the shiny blue Ford Mustang that Steve gave away during the keynote, but that was a hard act to top. Curt Aubley, Intel Data Center Group’s vice president and CTO, took part in an executive summit with Nico Groh, SAP’s data center intelligence project owner, that addressed ongoing Intel and SAP engineering efforts to optimize SAP HANA* power and performance management on Intel® architecture.

 

I was at the conference filming man-on-the-street interviews with some of Intel’s visiting executives. I had a great conversation with Pauline Nist, general manager of Intel’s Enterprise Software Strategy, on the subject of Cloud: Public, Private, and Hybrid for the Enterprise, and the future of the in-memory data center. I also spoke to Curt Aubley about How Intel is Influencing the Ecosystem Data Center and how sensors and telemetry can provide real-time diagnostics on the health of your data center.

 

In the Intel booth, we also had the fun of launching our latest animation, Intel and SAP: The Perfect Team for Your Real-Time Business, a light-hearted look at the rich, long-standing alliance between SAP and Intel. In the video, the joint SAP HANA and Intel® Xeon® processor platform has the power of a space rocket—a bit of an exaggeration, perhaps. But SAP HANA is a mighty powerful in-memory database, designed from the ground up for Intel Xeon processors. Dozens of Intel engineers were involved in the development of SAP HANA, working directly with SAP to optimize SAP HANA for Intel architectures.

 

 

 

It’s not too late to catch some of the action from our booth! We filmed a number of our Intel Tech Talks, so click on these links to watch industry experts discussing the latest news and advances in the overlapping orbits of SAP and Intel.

 

 

Follow me at @TimIntel and search #TechTim to get the latest on analytics and data center news and trends.

Read more >

On the Ground at SC14: Fellow Traveler Companies

Let’s talk about Fellow Travelers at SC14 – companies that Intel is committed to collaborating with in the HPC community. In addition to the end-user demos in the corporate booth, Intel took the opportunity to highlight a few more companies in the channel booth and on the Fellow Traveler tour.

 

Intel is hosting three different Fellow Traveler tours on Discovery, Innovation, and Vision. A tour guide leads a small group of SC14 attendees through the show floor to visit eight company booths (with a few call outs to additional Fellow Travelers along the way). Yes, you wear an audio headset to hear your tour guide. And yes, you follow a flag around the show floor. On our 30 minute journey around the floor, my Discovery tour visited (official stops are bolded):

  • Supermicro: Green/power efficient supercomputer installation at the San Diego Supercomputer Center
  • Cycle Computing: Simple and secure cloud HPC solutions
  • ACE Computers: ACE builds customized HPC solutions, and customers include scientific research/national labs/large enterprises. The company’s systems handle everything from chemistry to auto racing and are powered by the Intel Xeon processor E5 v3. Fun fact, the company’s CEO is working on the next EPEAT standard for servers.
  • Kitware: ParaView (co-developed by Los Alamos National Laboratory) is an open-source, multi-platform, extensible application designed for visualizing large data sets.
  • NAG: A non-profit working on numerical analysis theory, they also take on private customers and have worked with Intel for decades on tuning algorithms for modern architecture. NAG’s code library is an industry standard.
  • Colfax: Offering training for parallel programming (over 1,000 trained so far).
  • Iceotope: Liquid cooling experts whose solutions offer better performance per watt than hybrid liquid-and-air cooling.
  • Huawei: Offering servers, clusters (they’re Intel Cluster Ready certified) and Xeon Phi coprocessor solutions.
  • Obsidian Strategics: Showcasing a high-density Lustre installation.
  • AEON: Offering fast and tailored Lustre storage solutions in a variety of industries including research, scientific computing and entertainment; they are currently architecting a Lustre storage system for the San Diego Supercomputer Center.
  • NetApp: Their booth highlighted NetApp’s storage and data management solutions. A current real-world deployment includes 55PB of NetApp E-Series storage that provides over 1TB/sec to a Lustre file system.
  • Rave Computer: The company showcased the RT1251 flagship workstation, featuring dual Intel Xeon processor E5-2600 series with up to 36 cores and up to 90MB of combined cache. It can also make use of the Intel Xeon Phi co-processor for 3D modeling, visualization, simulation, CAD, CFD, numerical analytics, computational chemistry, computational finance, and digital content creation.
  • RAID Inc: Demo included a SAN for use in big data, running the Intel Enterprise Edition of Lustre with OpenZFS support. RAID’s systems accelerate time to results while lowering costs.
  • SGI: Showcased the SGI ICE X supercomputer, the sixth generation in the product line and the most powerful distributed memory system on the market today. It is powered by the Intel Xeon processor E5 v3 and includes warm water cooling technology.
  • NCAR: Is answering the question, how do you refactor an entire climate code. NCAR, in collaboration with the University of Colorado at Boulder is an Intel Parallel Computer Center aiming to develop tools and knowledge to help with the performance improvements of CESM, WRF, and MPAS on Intel Xeon and Intel Xeon Phi processors.


Intel Booth – Fellow Traveler Tours depart from the front right counter

 

After turning in my headset, I decided to check out the Intel Channel Pavilion next to Intel’s corporate booth. The Channel Pavilion has multiple kiosks (so many that they switched halfway through the show), each showcasing a demo with Intel Xeon and/or Xeon Phi processors, and highlighting a number of products and technologies. Here’s a quick rundown:

  • Aberdeen: Custom servers and storage featuring Intel Xeon processors
  • Acme Micro: Solutions utilizing the Intel Xeon processor and Intel SSD PCIe cards
  • Advanced Clustering Technologies: Clustered solutions in 2U of space
  • AIC: Alternative storage hierarchy to achieve high bandwidth and low latency via Intel Xeon processors
  • AMAX: Many core HPC solutions featuring Intel Xeon processor E5-2600 v3 and Intel Xeon Phi coprocessors
  • ASA Computers: Einstein@Home uses an Intel Xeon processor based server to search for weak astrophysical signals from spinning neutron stars
  • Atipa Technologies: Featuring servers, clustering solutions, workstations and parallel storage
  • Ciara: The Orion HF 620-G3 featuring the Intel Xeon processor E5-2600 v3
  • Colfax: Colfax Developer Training on efficient parallel programming for Xeon Phi coprocessors
  • Exxact Corporation: Accelerating simulation code up to 3X with custom Intel Xeon Phi coprocessor solutions
  • Koi Computers: Ultra Enterprise Class servers with the Intel Xeon processor E5-2600 v3 and a wide range of networking options
  • Nor-Tech: Featuring a range of HPC clusters/configurations and integrated with Intel, ANSYS, Dassault, Simula, NICE and Altair
  • One Stop Systems: The OSS 3U high density compute accelerator can utilize up to 16 Intel Xeon Phi coprocessors and connect to 1-4 servers

 

The Intel Channel Pavilion

 

After completing the booth tours, I decided to head back to the Intel Parallel Computing Theater to listen to a few more presentations on how companies and organizations are putting these systems into action.

 

Joseph Lombardo, from the National Supercomputing Center for Energy and the Environment (NSCEE), stopped by the theater to talk about the new data center they’ve recently put into action, as well as their use of a data center from Switch Communications. The NSCEE faces several challenges: massive computing needs (storage and compute power), time-sensitive projects (those with governmental and environmental significance), and numerous, complex workloads. In their Alzheimer’s research, the NSCEE compares the genomes of Alzheimer’s patients with normal genomes. They worked with Altair and Intel on a system that reduced their runtime from 8 hours to 3 hours, while improving system manageability and extensibility.

 

Joseph Lombardo from the NSCEE

 

Then I listened in to Michael Klemm from Intel talking about offloading Python to the Intel Xeon Phi coprocessor. Python is a quick, high-productivity language (with packages such as IPython, NumPy/SciPy, and Pandas) that can help compose scientific applications. Michael talked through the design principles for the pyMIC offload infrastructure: simple usage, a slim API, fast code, and keeping control in the programmer’s hands.
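For a flavor of what that slim API looks like in practice, here is a rough sketch of the pyMIC offload pattern pieced together from the project's published examples. The exact function names may differ between pyMIC versions, and the kernel library and kernel name below are placeholders of my own, so treat this as an illustration of the pattern rather than copy-paste code.

```python
# Rough sketch of the pyMIC offload pattern; names are recalled from the
# project's published examples and may differ by version. The library file
# and kernel name are placeholders, not real artifacts.
import numpy as np
import pymic

device = pymic.devices[0]                          # first Xeon Phi coprocessor
stream = device.get_default_stream()               # offload commands go through a stream
library = device.load_library("libmykernels.so")   # native kernels are built separately

a = np.random.random(1_000_000)
offl_a = stream.bind(a)                            # copy the array to the coprocessor
stream.invoke(library.my_kernel, offl_a, a.shape[0])  # run the native kernel on the card
offl_a.update_host()                               # copy results back to the host array
stream.sync()                                      # wait for all offloaded work to finish
```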

 

Michael Klemm from Intel

 

Wolfgang Gentzsch from UberCloud covered HPC for the masses via cloud computing. Currently, more than 90 percent of an engineer’s or scientist’s in-house HPC runs on workstations and about 5 percent on servers. Less than 1 percent runs in HPC clouds, which represents a ripe opportunity if challenges like security/privacy/trust, control of data (where and how your data is running), software licensing, and the transfer of heavy data can be resolved. There are some hefty benefits – pay per use, easily scaling resources up or down, low risk with a specific cloud provider – that may start to entice more users shortly. UberCloud currently has 19 providers and 50 products in its marketplace.

 

Wolfgang Gentzsch from UberCloud

 

The Large Hadron Collider is probably at the top of my list of places to see before I die, so I was excited to see Niko Neufeld from LHCb CERN talk about their data acquisition and storage challenge. I know, yet another big data problem. But the LHC generates one petabyte of data EVERY DAY. Niko talked through how they’re able to use sophisticated filtering (via ASICs and FPGAs) to get that down to 30PB of stored data a year, but that’s still an enormous challenge. The team at CERN is interested in looking at the Intel Omni-Path Architecture to help them move data faster, and then integrating Intel Xeon + FPGA with Intel Xeon and Intel Xeon Phi processors to help them shave down the amount of data stored even more.
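Out of curiosity, here is what those two quoted figures imply under the simplifying assumption that the 1 PB/day raw rate were sustained year-round (which it isn't, since the accelerator doesn't run every day); even then, the filtering stages would be keeping well under a tenth of the raw data.

```python
# What the two figures from the talk imply, under the labeled simplifying
# assumption of year-round running at the quoted raw rate.
raw_pb_per_day = 1.0           # from the talk
stored_pb_per_year = 30.0      # from the talk
raw_pb_per_year = raw_pb_per_day * 365   # assumes 365 days of running
kept_fraction = stored_pb_per_year / raw_pb_per_year
print(f"~{raw_pb_per_year:.0f} PB/yr raw vs {stored_pb_per_year:.0f} PB/yr stored "
      f"-> roughly {kept_fraction:.0%} kept")
```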

 

Niko Neufeld from LHCb CERN

 

And finally, the PUCC held matches 4 and 5 today, the last of the initial matches and the first of the playoffs. In the last regular match, Taiji took on the EXAMEN and, in a stunning last-second “make” run, the EXAMEN took it by a score of 4763 to 2900. In the afternoon match, the Brilliant Dummies took on the Gaussian Elimination Squad (the defending champs). It was a hard-fought battle; for many of the questions, both teams had answered before the multiple-choice options were even shown to the audience. In the end, the Brilliant Dummies were able to eliminate the defending champions by a score of 5082 to 2082. Congratulations to the Brilliant Dummies; we’ll see you in the final on Thursday.

 

We’ll see the Brilliant Dummies in the PUCC finals on Thursday

Read more >

The Final Day for the 2014 Parallel Universe Computing Challenge @ SC14

Thursday, November 20, 2014

Dateline:  New Orleans, LA, USA

 

This morning at 11:00 AM (Central time, New Orleans, LA), the second semi-final match of the 2014 Parallel Universe Computing Challenge will take place at the Intel Parallel Theater (Booth 1315) as the Coding Illini team from NCSA and UIUC faces off against the EXAMEN from Europe. The Coding Illini earned their spot in this semi-final match by beating the team from Latin America (SC3), and the EXAMEN earned theirs by beating team Taiji from China.

 

The winner of this morning’s semi-final match will go on to play the Brilliant Dummies from Korea in the final competition match this afternoon at 1:30PM, live on stage from Intel’s Parallel Universe Theater.

 

The teams are playing for the grand prize of $26,000 to be donated to a charitable organization of their choice.

 

Don’t miss the excitement:

  • Match #5 is scheduled at 11:00AM
  • The Final Match is scheduled at 1:30PM

 

Packed crowd watching the PUCC

Read more >