Recent Blog Posts

Infrastructure and What Is Now Possible

“The new center of possibility” is the theme of Intel’s involvement in HP Discover 2015 in Las Vegas this year. What does that mean? As I look back at what technology has delivered over my career, the transformation is amazing. One of my early jobs was to scan documents and convert them to digital format. At the time, almost nothing was digital first. Today, why would anyone create a document that cannot be easily edited, when a digital format lets you transform and evolve it simply and efficiently? The same holds for every business, because of the changes in computing and the data center.

 

Over the past decade, the cost of computing has dropped 40 percent and the cost of storage has fallen 90 percent, while network capabilities have increased by a factor of 40. These changes, in conjunction with the innovation each of your businesses brings to the table, are driving the digital service economy.

 

Intel, with partners like HP, is creating a more efficient data center infrastructure, which requires innovation across every chip, platform and application. This is one of the key trends driving the data center of the future and, in turn, the new center of possibility.


 

The journey to the Software-Defined Infrastructure

Last year HP and Intel unveiled the Apollo platform designed for High Performance Computing environments. Intel worked closely with HP to design a solution to solve many of its own internal challenges related to the platforms necessary for designing and manufacturing silicon. HP also announced the Helion platform, which aligns with work that Intel has fostered in OpenStack for the past several years.

 

During HP Discover 2014, HP and Intel held discussions about what we could do to help accelerate the journey to the Software-Defined Infrastructure (SDI) through our support of OpenStack and other software efforts. This journey is focused on transitioning from a very fixed environment, which represents the data center of the past, to the much more flexible, secure and scalable environment that is the data center of the future.

 

The workloads are placed into a secure environment with policies and service levels that are managed automatically by intelligent software that learns about the workloads and adapts to business needs. The scale of systems today combined with changing market demands makes the SDI a critical end point for every organization. In the past, there were hundreds of systems to support organizations. Now there are thousands as the role of technology increases through automation, virtualization and coordination. This can become unmanageable without an SDI environment to control and maintain growth. 

 

SDI in action

As SDI systems mature, the possibilities expand. A key example is the continued growth in connected devices and the amount of data they make available. This data can be used to target the right channels to increase business, as Intel has done by joining web data to sales systems data. This process determines customer interest and allows us to spend time with the customers most likely to adopt products with Intel Inside.

 

Harnessing and securing this expanded data to improve products and services is a key opportunity for many businesses to increase their value. We can also see the use of this data in other fields, such as healthcare, which is now utilizing genomic information to tailor treatments to individual needs and can produce an analysis within days, where it previously took weeks. With these new data sources and the decrease in processing time, the possibilities are endless.

 

Intel at HP Discover

While I cannot tell you what announcements are going to be made, I can assure you that our partnership with HP on SDI, Big Data and analytics, and Platforms of the Future will continue to bring solutions that will help every business realize value from their new and innovative ideas.

 

One particular target for Intel: by 2020, getting from genomic sequencing to individual treatment options in a single day. Solutions like this will forever change the way we interact with technology by improving the way we work, live and play. Until then, I hope you have a great experience at HP Discover 2015. Reach out and let me know what business platforms you think need to be unleashed to open up the new center of possibilities within your organization.

 

Don’t miss our Innovation Theater presentation on this topic:

 

The New Center of Possibility – Datacenters helping organizations excel in the digital services economy

Session #: IT3911

Tuesday June 2, 11:00-11:30 am

Innovation Theater, Discover Zone

 

Dramatic improvements in compute, storage and network technology are converging with modern software and growth in connected devices. This is enabling new capabilities and connections among people, organizations and machines. Come hear from Intel about how Software-Defined Infrastructure will have a positive impact on IT’s ability to better manage the increasingly complex IT infrastructure and, more importantly, create new and exciting business opportunities.


Ed Goldman

CTO Enterprise Segment, Intel Corporation


Change Your Desktops, Change Your Business. Part 2: Pay Less for Power

Efficiency. We talk about it all the time these days, but are we really doing everything we can to achieve it? In my first post in this PC refresh series, we discussed how updating your aging PCs to newer systems can help your employees become more efficient and productive.

 

But efficiency has another facet we should consider: how much energy and money we are unwittingly wasting every month by using those same aging PCs. The truth is, if you’re using older desktop PCs, the watts and dollars can really add up. That can translate into big money you could be putting toward your business (I’ll get to how much in a second).

 

According to the same report I cited in my previous post, both All-in-One PCs and Mini PCs consume significantly less power than your older desktop tower. In fact, and this is what really struck me, not only did they use less power than the older systems, they used less power even when they were “under load” and the older systems were idle. How much less? Here’s what the study found:

 

  • The All-in-One PC consumed 55 percent fewer average watts while idle and 53 percent fewer average watts under load than did the legacy desktop tower.
  • The Mini consumed 60 percent fewer average watts while idle and 59 percent fewer average watts under load than did the legacy desktop tower.

 

Now, turn those savings into money, and it’s easy to see how new systems can prove a boon to your bottom line. Just consider this: The study reported that a business that replaces 10,000 legacy desktop towers with 10,000 new mini desktops can save $8.88 per employee every year. Over a four-year deployment, that adds up to a total savings of up to $355,200 in power costs.1 We might be talking mini desktops, but those aren’t mini savings. Something to think about.
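As a rough sanity check, the cost model described in footnote 1 takes only a few lines to sketch. The wattage figures below are hypothetical placeholders for illustration, not the study’s measured values:

```python
# Sketch of the cost model in footnote 1. The wattage figures below are
# hypothetical placeholders, not the study's measured values.
RATE_PER_KWH = 0.1075          # average U.S. commercial power cost (footnote 1)
WORK_DAYS = 46 * 5             # 46-week work year, five workdays per week
HOURS_LOAD, HOURS_IDLE = 1, 7  # per employee per day (footnote 1)

def annual_power_cost(watts_idle, watts_load):
    """Annual electricity cost, in dollars, for one desktop."""
    kwh_per_day = (watts_load * HOURS_LOAD + watts_idle * HOURS_IDLE) / 1000
    return kwh_per_day * WORK_DAYS * RATE_PER_KWH

# Hypothetical example: a legacy tower vs. a mini PC drawing ~60% fewer watts.
legacy = annual_power_cost(watts_idle=50, watts_load=90)
mini = annual_power_cost(watts_idle=20, watts_load=37)
print(f"per-system annual saving: ${legacy - mini:.2f}")
```

Plugging in the measured idle and load wattages from the study would reproduce its per-system figure; multiplying by the fleet size gives the total.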

 

There’s a lot more to come in this desktop series on PC refresh, so make sure to come back. Next up is Part 3, in which we’ll take a look at how new PCs can help make IT more effective. In the meantime, take a look at the complete version of the study cited above online, and feel free to join the conversation using #IntelDesktop.

 

This is the second and most recent installment of the “Change Your Desktops, Change Your Business” series in the Desktop World Tech Innovation Series. To view the other posts in the series, click here: Desktop World Series.

 

1. Based on a conservative estimate of one hour under load and seven hours idle for average power consumption per employee per day, 46-week work year per employee, and average U.S. commercial power costs of $0.1075 per kilowatt-hour from the U.S. Energy Information Administration (www.eia.gov/electricity/monthly/pdf/epm.pdf)


Blueprint: Tips for Avoiding a Data Center Blizzard

We’re in the depth of winter and, yes, the snow can be delightful… until you have to move your car or walk half a block on icy streets. Inside the datacenter, the IT wonderland may lack snowflakes, but everyday activities are just as challenging year-round. Instead of snowdrifts and ice, tech teams face mountains of data.

 

So what are the datacenter equivalents of snowplows, shovels, and hard physical labor? The right management tools and strategies are essential for clearing data paths and allowing information to move freely and without disruption.

 

This winter, Intel gives a shout-out to the unsung datacenter heroes, and offers some advice about how to effectively avoid being buried under an avalanche of data. The latest tools and datacenter management methodologies can help technology teams overcome the hazardous conditions that might otherwise freeze up business processes.

 

Tip #1: Take Inventory

 

Just as the winter holiday season puts a strain on family budgets, the current economic conditions continue to put budget pressures on the datacenter. Expectations, however, remain high. Management expects to see costs go down while users want service improvements. IT and datacenter managers are being asked to do more with less.

 

The budget pressures make it important to fully assess and utilize the in-place datacenter management resources. IT can start with the foundational server and PDU hardware in the datacenter. Modern equipment vendors build in features that facilitate very cost-effective monitoring and management. For example, servers can be polled to gather real-time temperature and power consumption readings.

 

Middleware solutions are available to take care of collecting, aggregating, displaying, and logging this information, and when combined with a management dashboard can give datacenter managers insights into the energy and temperature patterns under various workloads.

 

Since the energy and temperature data is already available at the hardware level, introducing the right tools to leverage the information is a practical step that can pay for itself in the form of energy savings and the ability to spot problems such as temperature spikes so that proactive steps can be taken before equipment is damaged or services are interrupted.
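A minimal sketch of that polling-and-alerting loop might look like the following. The `read_power` and `read_temp` callables are hypothetical stand-ins for whatever interface your hardware actually exposes:

```python
# Minimal sketch of the polling-and-alerting loop described above. The
# read_power / read_temp callables are hypothetical stand-ins for whatever
# interface your hardware actually exposes (IPMI, Redfish, vendor middleware).
from statistics import mean

TEMP_LIMIT_C = 75  # illustrative inlet-temperature threshold

def poll(servers, read_power, read_temp):
    readings = [(s, read_power(s), read_temp(s)) for s in servers]
    return {
        "avg_watts": mean(w for _, w, _ in readings),
        # Hot spots are candidates for workload migration or extra cooling
        # before equipment is damaged or services are interrupted.
        "hot_spots": [s for s, _, t in readings if t > TEMP_LIMIT_C],
    }

# Example with canned readings:
power = {"rack1-a": 310, "rack1-b": 295}.get
temp = {"rack1-a": 68, "rack1-b": 81}.get
print(poll(["rack1-a", "rack1-b"], power, temp))
```

In a real deployment the loop would run on a schedule and log its history, feeding the dashboards and trend analysis described above.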

 

Tip #2: Replace Worn-Out Equipment

 

While a snow shovel can last for years, datacenter resources are continually being enhanced, changed, and updated. IT needs tools that allow it to keep up with requests and to deploy and configure software efficiently at a rapid pace.

 

Virtualization and cloud architectures, which evolved in response to the highly dynamic nature of the datacenter, have recently been applied to some of the most vital datacenter management tools. Traditional hardware keyboard, video, and mouse (KVM) solutions for remotely troubleshooting and supporting desktop systems are being replaced with all-software and virtualized KVM platforms. This means that datacenter managers can quickly resolve update issues and easily monitor software status across a large, dynamic infrastructure without having to continually manage and update KVM hardware.

 

Tip #3: Plan Ahead

 

It might not snow every day, even in Alaska or Antarctica. In the datacenter, however, data grows every day. A study by IDC, in fact, found that data is expected to double in size every two years, reaching 44 zettabytes by 2020. An effective datacenter plan depends on accurate projections of data growth and of the server expansion required to support that growth.
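The arithmetic behind that kind of projection is simple enough to sketch; the 500 TB starting point below is purely illustrative:

```python
# Capacity projection from the "data doubles every two years" rule of thumb
# cited above. The 500 TB starting point is purely illustrative.
def projected_tb(capacity_now_tb, years, doubling_period_years=2):
    return capacity_now_tb * 2 ** (years / doubling_period_years)

print(projected_tb(500, 4))  # doubles twice over four years
```

Real plans would start from measured growth rates rather than the rule of thumb, but the doubling model is a useful upper bound for budgeting.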

 

The same tools that were previously mentioned for monitoring and analyzing energy and temperature patterns in the datacenter can help IT and datacenter architects better understand workload trends. Besides providing insights about growth trends, the tools promote a holistic approach for lowering the overall power budget for the datacenter and enable datacenter teams to operate within defined energy budget limits. Since many large data centers already operate near the limits of the local utility companies, energy management has become mission critical for any fast-growing datacenter.

 

Tip #4: Stay Cool

 

Holiday shopping can be a budget buster, and the credit card bills can be quite a shock in January. In the datacenter, rising energy costs and green initiatives similarly strain energy budgets. Seasonal demands, which peak in both summer and the depths of winter, can mean more short-term outages and big storms that can force operations over to a disaster recovery site.

 

With the right energy management tools, datacenter and facilities teams can come together to maximize the overall energy efficiency of the datacenter and its environmental conditioning (humidity control, cooling, etc.). For example, holistic energy management solutions can identify ghost servers: systems that are idle yet still consuming power. Hot spots can be located and workloads shifted so that less cooling is required and equipment life is extended. The average datacenter sees 15 to 20 percent savings on overall energy costs after introducing an energy management solution.

 

Tip #5: Read the Signs of the Times

 

During a blizzard, the local authorities direct the snowplows, police, and rescue teams to keep everyone safe. Signs and flashing lights remind everyone of the rules. In the datacenter, the walls may not be plastered with the rules, but government regulations and compliance guidelines are woven into the vital day-to-day business processes.

 

Based on historical trends, regulations will continue to increase, and datacenter managers should not expect any decrease in required compliance-related effort. Public awareness of energy resources and the environmental impact of energy exploration and production also encourages regulators.

 

Fortunately, the energy management tools and approaches that help improve efficiencies and lower costs also enable overall visibility and historical logging that supports audits and other compliance-related activities.

 

When “politically correct” behavior and cost savings go hand in hand, momentum builds quickly. This effect is both driving demand for and promoting great advances in energy management technology, which bodes well for datacenter managers since positive results always depend on having the right tools. And when it comes to IT Wonderlands, energy management can be the equivalent of the whole tool-shed.


Mobile Networks and Internet Swamped by Exploding Video

By Bill Rollender, AM PLM, Media, & CPU Technology, Intel

 

 

Everyone knows video streaming is popular, but who was expecting it to take up the lion’s share of network bandwidth? Cisco* reports that mobile video traffic exceeded 50 percent of total mobile data traffic for the first time in 2012 and forecasts that it will increase to three-fourths by 2019.1 On the Internet side, Cisco expects IP video traffic to be 79 percent of all global consumer Internet traffic in 2018, up from 66 percent in 2013.2

 


 

Service provider dilemma

 

As video continues to be a major consumer of network bandwidth, driven by the popularity of social media and mobile devices, service providers need more effective strategies to handle the additional media traffic without breaking the bank.

 

“Service providers have to drive down both the capital and operating costs with video delivery—without sacrificing quality, reliability, or scale,” said Robert Courteau, Executive V.P., Communications BU, Kontron*.3

 

Equipment manufacturer challenge

 

Exploding video traffic is placing unprecedented demands on equipment manufacturers to increase workload density and throughput of media servers. In general, they need new ways to increase performance while keeping power consumption in check.

 

Plus, media servers must help service providers balance user demand with operational factors, such as power, bandwidth, advanced traffic control, differing standards, and quality of service. From the core of the network to its edges, service providers are demanding equipment that satisfies their needs for growth, quality delivery, and diversified services.

 

Optimized content delivery platform

 

With mobile device battery life top of mind for users, the need for efficient video transcoding in the cloud has never been greater. Working with Intel, Kontron has created the optimal solution that streams content in formats suited to mobile devices while addressing the issues of energy efficiency, scalability, and cost for service providers.

 

The Kontron* SYMKLOUD* platform features up to 18 Intel® Core™ i7 processors with integrated Intel® HD Graphics 4000 that transcode video streams without consuming CPU cycles, so there’s plenty of headroom remaining for other applications, like video analytics. It also supports OpenFlow* for software-defined networking (SDN) and network functions virtualization (NFV) deployments.

 

According to Kontron, the Intel processor is the best-in-class solution for media optimization applications. This means smoother visual quality, spectacular HD media playback, and improved ability to decode and transcode simultaneous video streams.4

 

Higher throughput now available

 

Intel recently launched two Intel® Xeon® processors expressly designed to deliver an exceptionally large number of video transcoding channels per watt for demanding media processing applications. The Intel® Xeon® processor E3-1278L v4 and the Intel® Xeon® processor E3-1258L v4 integrate Intel® Iris™ graphics (i.e., on-processor graphics) to help minimize the CapEx and OpEx of equipment executing media applications.

 

Since the processor graphics is on-chip, it consumes less power than an add-in graphics card and delivers four to five times more media acceleration than software-only media processing.5
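In practice, taking advantage of that on-processor media acceleration usually means routing transcodes through a framework that exposes Quick Sync. A hedged sketch, assuming an ffmpeg build with QSV support is installed (file names and bitrate are placeholders):

```python
# Hedged sketch: routing an H.264 transcode through the integrated graphics,
# assuming an ffmpeg build with Intel Quick Sync (QSV) support is installed.
# File names and bitrate are placeholders.
import subprocess

def qsv_transcode_cmd(src, dst, bitrate="4M"):
    """Build an ffmpeg command that decodes and encodes via QSV."""
    return [
        "ffmpeg", "-y",
        "-hwaccel", "qsv",   # hardware-accelerated decode where supported
        "-i", src,
        "-c:v", "h264_qsv",  # Quick Sync H.264 encoder
        "-b:v", bitrate,
        dst,
    ]

def transcode(src, dst, bitrate="4M"):
    subprocess.run(qsv_transcode_cmd(src, dst, bitrate), check=True)
```

Because the encode runs on the integrated graphics, the host CPU cores stay free for other work, which is where the channels-per-watt density gains come from.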

 

The 5th generation Intel Xeon processor E3-1278L v4 increases the number of H.264 transcoded streams from 12 to 18 for about a 50 percent improvement over 4th generation Intel® Core™ processor-based designs in the same thermal envelope.6

 

Transcoding Reduces Video Traffic

 

The growing popularity of video streaming services, such as YouTube*, Hulu*, and Netflix*, and the proliferation of 4K high-definition content concerns many service providers. But they can reduce network bandwidth requirements for video content with media servers based on Intel® architecture that enable a range of low-power, high-density, and scalable solutions.

 

 

 

 

1 Source: “Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2014–2019,” February 3, 2015, http://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white_paper_c11-520862.html

 

2 Source: “Cisco Visual Networking Index: Forecast and Methodology, 2013–2018,” June 10, 2014, http://www.cisco.com/c/en/us/solutions/collateral/service-provider/ip-ngn-ip-next-generation-network/white_paper_c11-481360.html

 

3 Source: “Kontron and Genband to Showcase HD Video Delivery Reference Solution for Service Provider NFV Environments,” February 25, 2015, http://www.kontron.com/about-kontron/news-events/detail/genband.

 

4 Source: “Kontron Launches the Symkloud MS2900 Media Platform for Cloud Transcoding,” February 25, 2013, http://www.kontron.com/about-kontron/news-events/detail/kontron-launches-the-symkloud-ms2900-media-platform-for-cloud-transcoding.

 

5 Source: AnandTech, “Intel Iris Pro 5200 Graphics Review: Core i7-4950HQ Tested,” June 1, 2013, http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/18.

 

6 Source: Intel testing.


Quanta Cloud Technology Expands its Product Portfolio with Intel Xeon Processor E3-1200 v4 Product Family

By Mike Yang, General Manager of QCT (Quanta Cloud Technology)

 


 

In my last blog, I talked about how the pursuit of greater efficiency is a critical tenet of business survival for any company that operates datacenters. These efficiencies are measured by metrics like total operational cost, dollar per watt, and total cost of ownership (TCO).

 

At QCT (Quanta Cloud Technology), we have been extremely successful in the hyperscale datacenter space by delivering high-performance, high-efficiency systems equipped with dual-socket Intel® Xeon® processor E5-2600 or quad-socket Intel® Xeon® processor E7-4800 CPUs. We launched more than a dozen new server series supporting the Intel Xeon processor E5-2600 v3 product family that greatly improve efficiency for our customers.

 

That’s why we’re happy to announce today that we have become a time-to-market partner on the latest Intel® Xeon® processor E3-1200 v4 product family.

 

The Intel Xeon processor E3 is targeted at low-end servers and microservers, which are an emerging category of dense servers for web hosting and cloud implementations. Microservers usually have lower-power processors and are designed to handle large volumes of lightweight web or cloud transactions, like search queries and social networking page renderings. QCT offers a range of server platforms, each designed to meet different workloads perfectly.

 

In addition to the hosting applications noted above, the latest Intel Xeon processor E3-1200 v4 product family is the first processor generation focused on media and graphics workloads, an emerging datacenter application area based on feedback from our customers.

 

Media service providers all face the same challenge: streaming ever-increasing volumes of content to a rapidly growing global market connected with billions of mobile devices. These customers need transcode solutions with cost-efficient, dense designs that can deliver high-definition video to an array of devices and mobile operating systems.

 

A datacenter graphics server based on the Intel Xeon processor E3-1200 v4 product family does just that. The new processor can support more transcoding jobs per node when compared with discrete graphics.

 

This is big news, made possible by Intel. Now you can support more concurrent media transcoding functions in parallel, lowering your total cost of ownership while enabling a better experience for user-generated media, on-demand viewing, live broadcasting or videoconferencing. Whether you host desktops and workstations remotely or deliver video in the cloud, the graphics performance of the Intel Xeon processor E3-1200 v4 product family can provide the rich visual experiences end users seek. At the same time, these customers will benefit from greater energy efficiency.

 

But the potential is bigger than optimal efficiency for video transcoding and streaming. I can envision using the Intel Xeon processor E3-1200 v4 product family for big data analytics. These new processors deliver great computing power to capture valuable metrics, gain insights, and perform data-intensive tasks like video search indexing, digital surveillance, and automated ads that react to user behavior.

 

We at QCT will be working closely—and quickly—with Intel and the Intel Xeon processor E3-1200 v4 product family to help customers build datacenters that are reliable and efficient, so customers can focus on core business growth and innovation of new software products.

 

The revolution is here. Let’s see how Intel and QCT transform the future datacenter together.


Meet the New Data Center Graphics Powerhouse

Data centers everywhere are dealing with a flood of video traffic, and the deluge is only going to grow in the years to come. Consider numbers like these: online video streaming viewership jumped by 60 percent in 2014 alone,1 video delivery has become the number one source of Internet traffic, and by 2018 video is set to comprise 80 percent of the Internet’s traffic.2

 

And it’s not just YouTube videos and Netflix streaming that are causing problems for data center operators. Many organizations are also dealing with the demands of complex 3D design applications and massive data sets that are delivered from secure data centers and used by design teams scattered around the world.

 

To keep pace with current and future growth in these graphics-intensive workloads, data center operators are looking to optimize their data center computing solutions specifically to handle an ever-growing influx of graphics-intensive traffic.

 

That’s the idea behind the new Intel® Xeon® processor E3-1200 v4 family with integrated Intel® Iris™ Pro graphics P6300 — Intel’s most advanced graphics platform. This next-generation processor, unveiled today at the Computex 2015 conference, also features Intel® Quick Sync Video, which accelerates portions of video transcoding software by running them in hardware.

 

This makes the Intel Xeon processor E3-1200 v4 family an ideal solution for streaming high volumes of HD video. It offers up to 1.4 times the transcoding performance of the Intel® Xeon® processor E3-1200 v3 family3 and can handle up to 4,300 simultaneous HD video streams per rack.

 

The new Intel Xeon processor E3-1200 v4 family is also a great processing and graphics solution for organizations that need to deliver complex 3D applications and large datasets to remote workstations. It supports up to 1.8 times the 3D graphics performance4 of the previous-generation Intel Xeon processor E3 v3 family.

 

I’m pleased to say that the new platform already has a lot of momentum with our OEM partners. Companies designing systems around the Intel Xeon Processor E3-1200 v4 family include Cisco, HP, Kontron, Servers Direct, Supermicro, and QCT (Quanta Cloud Technology).

 

Early adopters of the Iris Pro graphics-enabled solution include iStreamPlanet, which streams live video to a wide range of user devices via its cloud delivery platform. In fact, they just announced a new 1080p/60 fps service offering:

 

“We’re excited to be among the first to take advantage of Intel’s new Xeon processors with integrated graphics that provide the transcode power to drive higher levels of live video quality, up to 1080p/60 fps, with price to performance gains that allow us to reach an even broader market.” — Mio Babic, CEO, iStreamPlanet

 

The Intel Xeon processor E3-1200 v4 product family also includes Intel® Graphics Virtualization Technology for direct assignment (Intel® GVT-d). Intel GVT-d directly assigns a processor’s capabilities to a single user to improve the quality of remote desktop applications.

 

Looking ahead, the future is certain to bring an ever-growing flood of video traffic, along with ever-larger 3D design files. That’s going to make technologies like the Intel Xeon processor E3-1200 v4 family and Iris Pro graphics P6300 all the more essential.

 

For a closer look at this new data center graphics powerhouse, visit intel.com/XeonE3.

 

 

 

[1] WSJ: “TV Viewing Slips as Streaming Booms, Nielsen Report Shows.” Dec. 3, 2014.

[2] Sandvine report. 2014.

[3] Measured 1080p30 20MB streams: E3-1286L v3=10, E3-1285L v4=14.

[4] Measured 3DMark® 11: E3-1286L v3=10, E3-1285L v4=14.


The Conference Room Just Got Smarter: Introducing Intel® Unite™

New solution by Intel designed to modernize existing meeting spaces and offer a better way to work


Conference rooms are intended to encourage and facilitate collaboration, but how many times have you been in a meeting where someone couldn’t connect a laptop to the projector because the right cables were missing, or attendees outside the room couldn’t see what was being shared? The traditional conference room setup can be complex and can impede teamwork and productivity; that is the antithesis of collaboration.

 

The workplace is changing, and businesses need to transform to foster innovation and stay competitive. Intel is leading the way with technologies ranging from new wireless capabilities to business-ready form factors to leading-edge security, helping organizations modernize and develop a better way to work. As part of that continued vision, Intel today announced a product that gives employees a cost-effective and easy way to collaborate, so they can spend more time getting work done and less time fumbling with wires and equipment.

 

Unite for a Better Way to Work

 

With the combination of select Intel® Core™ vPro™ processor-powered mini PCs and the Intel® Unite™ software product, collaboration and meeting productivity is taken to the next level, transforming existing conference rooms into smart and connected meeting spaces with enhanced security.

 

The Intel Unite solution offers virtually seamless collaboration from any location, and fast, simple meeting starts. Instead of having to rely on adapters or dongles, this new solution uses the existing wireless networks within the business to connect PCs to existing displays, projectors or interactive whiteboards via an Intel Core vPro processor-based mini PC with the Intel Unite software.

 

Peer-to-peer sharing capabilities allow workers to collaborate outside the meeting room. Everyone, regardless of where they are in the world, can interact with the content, annotate the image in real time, and then quickly and easily share files.1

 

 

 

Enhanced Security and Improved IT Manageability

 

With the Intel Unite solution, meeting organizers share a unique, rotating PIN with users to allow them to join a session quickly, with security in mind. The PIN helps to identify proximity and ownership, so organizers can determine who they will or won’t allow onto their corporate network. The data is protected with 256-bit Secure Sockets Layer (SSL) encryption, helping ensure that it stays within the conference network. The solution stays within the virtual walls of your business and is not dependent on additional vendor solutions.1

 

Making the Intel Unite solution easy to deploy in corporate environments is key to ease of use, not only by end users but also by IT departments. Intel Unite software is designed to slot into existing IT client deployment models and client software distribution methods. This design offers convenience to IT from a deployment and management standpoint. With Intel vPro technology built into the mini PC, IT can also control, patch and repair the device, even when the system is offline.

 

The Smarter Conference Room Is Available Today

 

The Intel Unite solution easily works with most existing setups and will transform traditional conference rooms into modern meeting spaces affordably and without a massive overhaul. There’s no need to buy or install a whole new infrastructure when the Intel Unite solution offers interoperability with popular platforms already in the market. IT shops with Intel vPro technology can use many of the existing management tools, systems and policies already in place, decreasing the total cost of ownership.

 

Intel Unite was designed to bring together existing conferencing solutions, both hardware and software, into a simple and intuitive user experience. With a built-in extensibility plug-in design, IT can connect existing conferencing solutions, room controls and more. Intel intends to publish reference plug-ins to help customers see the benefits and possibilities with usages like guest access, Cisco* Telepresence integration and lighting controls. Intel Unite supports multiple operating systems, including Windows* and Mac OS X*, with other operating systems to be added in the future.

 

Better Conference Rooms, Better Ideas, A Better Way To Work

 

With Intel Unite, we have strived to modernize the conference room into a smart and connected meeting space with enhanced security. Technology should never be an artificial barrier to human creativity so we’ve worked to create a collaborative platform that enables rather than disables. Intel Unite is an innovative solution that we believe will help businesses improve how work gets done.

 

Intel is working with a number of OEMs (including ASUS, Dell, Fujitsu, HP, and Lenovo) as well as customers and partners on the Intel Unite solution offering.

 

Please visit www.intel.com/unite for more information and updates. To continue this conversation on Twitter, please use #WorkingBetter.

 


LEGAL DISCLAIMERS & COPYRIGHTS 

 

1 Intel technologies may require enabled hardware, specific software, or services activation. Performance varies depending on system configuration. No computer system can be absolutely secure. Check with your system manufacturer or retailer or learn more at www.intel.com.

 

Copyright © 2015 Intel Corporation. All rights reserved. Intel, the Intel logo, Intel Core, Intel vPro, and Intel Unite are trademarks of Intel Corporation in the U.S. and/or other countries.

 

*Other names and brands may be claimed as the property of others.


Optimizing Media Delivery in the Cloud

For cloud, media, and communications service providers, video delivery is now an essential service offering—and a rather challenging proposition.

 

In a world with a proliferation of viewing devices—from TVs to laptops to smart phones—video delivery becomes much more complex. To successfully deliver high-quality content to end users, service providers must find ways to quickly and efficiently transcode video from one compressed format to another. To add another wrinkle, many service providers now want to move transcoding to the cloud, to capitalize on cloud economics.

 

That’s the idea behind innovative Intel technology-based solutions showcased at the recent Streaming Media East conference in New York. Event participants had the opportunity to gain a close-up look at the advantages of deploying virtualized transcoding workflows in private or public clouds, with the processing work handled by Intel® architecture.

 

I had the good fortune to join iStreamPlanet for a presentation that explained how cloud workflows can be used to ingest, transcode, protect, package, stream, and analyze media on-demand or live to multiscreen devices. We showed how these cloud-based services can help communications providers and large media companies simplify equipment design and reduce development costs, while gaining the easy scalability of a cloud-based solution.

 

iStreamPlanet offers cloud-based video-workflow products and services for live event and linear streaming channels. With its Aventus cloud- and software-based live video streaming solution, the company is breaking new ground in the business of live streaming. Organizations that are capitalizing on iStreamPlanet technology include companies like NBC Sports Group as well as other premium content owners, aggregators, and distributors.

 

In the Intel booth, Vantrix showcased a software-defined solution that enables service providers to spread the work of video transcoding across many systems in parallel. With the company’s solution, transcoding workloads that might otherwise take up to an hour to run can potentially complete in just seconds.
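Vantrix’s actual pipeline isn’t described here, but the general fan-out pattern, splitting a long video into fixed-length segments and handing them to a pool of workers, can be sketched in a few lines of Python. Note that `transcode_segment` is a hypothetical stand-in for a real encoder invocation:

```python
from concurrent.futures import ThreadPoolExecutor

def split_segments(duration_s, segment_s):
    """Divide a video of duration_s seconds into (start, length) segments."""
    segments = []
    start = 0
    while start < duration_s:
        segments.append((start, min(segment_s, duration_s - start)))
        start += segment_s
    return segments

def transcode_segment(segment):
    # Placeholder for real work (e.g., dispatching the segment to a
    # hardware encoder); here we just report which segment was handled.
    start, length = segment
    return f"transcoded [{start}s, {start + length}s)"

def parallel_transcode(duration_s, segment_s, workers=4):
    # pool.map preserves segment order, so the results can be
    # concatenated back into a single output stream.
    segments = split_segments(duration_s, segment_s)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(transcode_segment, segments))

# A one-hour video in 10-minute chunks -> 6 segments processed in parallel.
results = parallel_transcode(3600, 600)
print(len(results))  # 6
```

In a real deployment the workers would be separate machines rather than threads, but the scheduling idea, many independent segments encoded concurrently and reassembled in order, is the same.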

 

While they meet different needs, solutions from iStreamPlanet and Vantrix share a common foundation: the Intel® Xeon® processor E3-1200 product family with integrated graphics processing capabilities. By making graphics a core part of the processor, Intel is able to deliver a dense, cost-effective solution that is ideal for video transcoding, cloud-based or otherwise.

 

The Intel Xeon processor E3-1200 product family supports Intel® Quick Sync Video technology. This technology uses dedicated media-processing hardware to accelerate transcoding, delivering better performance than software-only transcoding on the CPU cores, without sacrificing quality.

 

Want to make this story even better? To get a transcoding solution up and running quickly, organizations can use the Intel® Media Server Studio, which provides development tools and libraries for developing, debugging, and deploying media solutions on Intel-based servers.

 

With offerings like Intel Media Server Studio and Intel Quick Sync Video Technology, Intel is enabling a broad ecosystem that is developing innovative solutions that deliver video faster, while capitalizing on the cost advantages of cloud economics.

 

For a closer look at the Intel Xeon processor E3-1200 product family with integrated graphics, visit www.intel.com/XeonE3.


Part I: Data-Driven Science and the Coming Era of Petascale Genomics

Seventeen years. That’s how long it has taken us to move from the dawn of automated DNA sequencing to the data tsunami that defines next-generation sequencing (NGS) and genomic analysis in general today. I’m remembering, with some fondness, 1998: the year the life sciences got serious about automated DNA sequencing, and about sequencing the human genome in particular; the year the train left the station and genomics research went from the benchtop to prime mover of high-performance computing (HPC) architectures, never to look back.

 

1998 was the year Perkin Elmer formed PE Biosystems, an amalgam of Applied Biosystems, PerSeptive Biosystems, Tropix, and PE Informatics, among other acquisitions. That was the year PE decided it could sequence the human genome before the academics could, competing in effect against its own customers, and that it would do so by brute-force application of automated sequencing technologies. That was the year Celera Genomics was born and Craig Venter became a household name. At least if you lived in a household where molecular biology was a common dinnertime subject.

 

Remember Zip Drives?

In 1998, PE partnered with Hitachi to produce the ABI “PRISM” 3700, and hundreds of these machines were sold worldwide, kick-starting the age of genomics. PE Biosystems revenues that year were nearly a billion dollars. The 3700 was such a revolutionary product that it purportedly could produce as much DNA data in a single day as the typical academic lab could produce in a whole year. And yet, from an IT perspective, the 3700 was quite primitive. The computational engine driving the instrument was a Mac Centris, later upgraded to a Quadra, then finally to a Dell running Windows NT. There was no provision for data collection other than local storage, which, if you wanted any portability, meant the then-ubiquitous Iomega Zip Drive. You remember those? Those little purplish-blue boxes that sat on top of your computer and gave you a whopping 100 megabytes of portable storage. The pictures on my phone would easily fill several Zip disks today.

 

Networking the 3700 was no mean feat either. We had networking in 1998, of course; gigabit Ethernet and most wireless networking technologies were still just ideas, but 100 megabit (100Base-TX) connections were common enough, and just about anyone in an academic research setting had at least a 10 megabit (10Base-T) connection available. The problem was the 3700, and specifically the little Dell PC that was paired with the instrument and responsible for all the data collection and subsequent transfer of data to some computational facility (Beowulf-style Linux HPC clusters were just becoming commonplace in 1998 as well). As shipped from PE at that time, there was zero provision for networking and zero provision for data management beyond the local hard drive and/or the Zip Drive.

 

It seems laughable today, but PE did not consider storage and networking, i.e., the collection and transmission of sequencing data, a strategic platform element. I guess it didn’t matter, since they were making a BILLION DOLLARS selling 3700s and all those reagents, even if a local hard drive and sneakernet were your only realistic data management options. Maybe they just didn’t have the proper expertise at that time. After all, PE was in the business of selling laboratory instruments, not computers, storage, or networking infrastructure.

 

Changing Times

How times have changed. NGS workflows today practically demand HPC-style computational and data management architectures. The capillary electrophoresis sequencing technology in the 3700 was long ago superseded by newer and more advanced sequencing technologies, dramatically increasing the data output of these instruments while simultaneously lowering costs. It is not uncommon today for DNA sequencing centers to output many terabytes of sequencing data every day from each machine, and there can be dozens of machines all running concurrently. To be a major NGS center now means also being adept at collecting, storing, transmitting, managing, and ultimately archiving petascale amounts of data. That’s seven orders of magnitude removed from the Zip Drive. If you are also in the business of genomics analysis, you need to be expert in computational systems capable of handling data and data rates at these scales as well.
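The seven-orders-of-magnitude figure is easy to sanity-check: compare a 100 MB Zip disk against a petabyte of archived sequencing data.

```python
import math

zip_disk = 100 * 10**6   # 100 MB Zip disk, in bytes
petabyte = 10**15        # 1 PB, in bytes

# Ratio of the two capacities, expressed as powers of ten.
orders_of_magnitude = round(math.log10(petabyte / zip_disk))
print(orders_of_magnitude)  # 7
```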

 

Today, this means either massively scalable cloud-based genomics platforms or the more traditional and even higher scale HPC architectures that dominate all large research computing centers worldwide. We are far, far beyond the days of any single Mac Quadra or Dell server. Maybe if PE had been paying closer attention to the IT side of the NGS equation, they would still be making billions of dollars today.

 

In Part II of this blog, I’ll look at what’s in store for the next 17 years in genomics. Watch for the post next week.

 


James Reaney is Senior Director, Research Markets for Silicon Graphics International (SGI).
