Windows* 10 is rolling out to millions of customers, and with it the latest version of DirectX*. We wanted to educate game developers on the latest in Microsoft’s Direct3D* graphics API and why it… Read more
Have you ever reached out and tapped on a device screen or monitor only to be disappointed to discover that it’s not a touch screen? Touch is such a natural and intuitive way to manipulate what’s on our screens that it’s pretty jarring these days when we encounter a screen that only lets us use traditional buttons, keyboards, or mice.
Fortunately this experience is becoming less common as touch screens become an expected feature on all of our personal computing devices—including the PCs, monitors, and devices we use at work. Gartner says that we will have over 2.5 billion touch-enabled devices in use in 2015.1 And their popularity continues to grow.
Touch and the Modern Workplace
I love the touch-enabled systems I use at work. They make it amazingly easy to scroll through files, activate icons, zoom in and out of documents, and mark up presentations. If you asked me to choose between touch and a traditional keyboard and mouse, I don’t know that I could pick a favorite.
It’s so natural and intuitive to switch between old-school and touch inputs that I don’t even think about it. I automatically do what feels the most efficient and comfortable for the task at hand.
For me, this makes my daily work tasks feel more personal, more comfortable, more efficient, and honestly, a little more fun.
Touch Means Greater Productivity
Touch in the workplace also helps workers be more productive. Touch screen PCs, such as Intel processor-based All-in-Ones, give teams easy drawing, note-taking, and mark-up capabilities that provide a huge boost to creative collaboration. Plus, being able to instantly choose the input mode that best fits an activity helps streamline tasks and workflows.
This added productivity ultimately helps businesses save money. In fact, if a worker gets even 1 minute per day in added productivity from a touch-enabled All-in-One PC upgrade (a pretty conservative estimate, in my opinion), that boost would pay back the additional cost of the device within 20 months.2
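The payback claim above is easy to sanity-check with back-of-the-envelope arithmetic. This is a minimal sketch using assumed figures (a $200 device premium and a $30/hour loaded labor cost are my illustrative inputs, not numbers from the cited report):

```python
# Rough payback estimate for a touch-enabled All-in-One upgrade.
# All inputs below are illustrative assumptions, not figures from the report.
DEVICE_PREMIUM_USD = 200.0        # assumed extra cost of the touch-enabled PC
LOADED_LABOR_USD_PER_HOUR = 30.0  # assumed fully loaded labor cost
MINUTES_SAVED_PER_DAY = 1.0       # the conservative estimate from the text
WORKDAYS_PER_MONTH = 21

savings_per_day = LOADED_LABOR_USD_PER_HOUR * (MINUTES_SAVED_PER_DAY / 60.0)
savings_per_month = savings_per_day * WORKDAYS_PER_MONTH
payback_months = DEVICE_PREMIUM_USD / savings_per_month

print(f"Monthly savings: ${savings_per_month:.2f}")
print(f"Payback period: {payback_months:.1f} months")
```

With these assumptions the upgrade pays for itself in roughly 19 months, which is consistent with the "within 20 months" figure.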
Touch Is Everywhere
Touch screens are also helping to modernize customer-facing solutions like point-of-sale and self-service kiosks. And with more touch devices showing up in the workplace, popular applications like Microsoft Word, Excel, Outlook, and PowerPoint have been optimized for touch. Intel-based All-in-One PCs also support 10-finger multi-touch controls that allow several people to use touch screen functions at once, making collaboration activities like marking up documents or designing quick sketches even more efficient.
The power of touch is revolutionizing collaboration, productivity, and the workplace. It has certainly changed how I work, and I don’t think I could go back to being without it. Thankfully, it appears that touch for business devices is here to stay.
For more information about all the benefits of touch in today’s workplace, download the infographic.
2. March 2015, A Principled Technologies Report Commissioned by Intel: Change Your Desktops, Change Your Business.
NEW Intel® Iris™, Iris™ Pro, and HD Graphics Production Driver for Windows* 10 64-bit 220.127.116.11.4256
The Windows 10* production driver 18.104.22.168.4256 has been posted to Intel Download Center at the following direct link to the driver:
64bit – … Read more
Rugged devices are making an impact in the utility industry. To learn more, we recently caught up with Jim Dempsey from Panasonic to find out how utilities can increase customer satisfaction and worker productivity with enterprise grade rugged devices. Watch … Read more >
The Intel® Dual Band Wireless-AC 3165 is Intel’s 2nd generation 802.11ac, dual band, 1×1 Wi-Fi + Bluetooth® adapter. It’s engineered to be faster, stronger, and greener than the previous-gen Intel 802.11ac 1×1 products with shared Wi-Fi and Bluetooth antennas, … Read more >
The post Meet the New 2nd Gen Intel® Wireless-AC 3165 (1×1 802.11ac) for Windows 10 appeared first on Technology@Intel.
Guest Contributor, Moran Peri – Predictive Analytics Project Manager and Team Lead at Intel.
Moran specializes in data mining and machine learning solutions, from inception through post-implementation monitoring, in the areas of chips, manufacturing, and testing.
At least one million Americans and five million people worldwide are coping with Parkinson’s disease (PD). As of today, no objective diagnostic test exists, nor is there a cure. Much about PD remains a mystery.
Hundreds of skilled neurologists, mathematicians, and data analysts across the globe are looking for a real-life, at-home PD data set on which to exercise their expertise and pursue innovative research directions.
Intel is working with The Michael J. Fox Foundation for Parkinson’s Research (MJFF) to enable breakthroughs in Parkinson’s disease research through wearable technology and big data analytics. A large, virtual data-gathering study is now underway: it gathers sensor data from PD patients, analyzes that data to identify patterns and make generalizations, and uses the insights gained to accelerate the development of therapeutic breakthroughs, and potentially even a cure, for the disease. Roughly 130 patients in the United States are already streaming sensor data, and more than 65,000 hours of data have been gathered so far.
Utilizing Wearable Technology for PD research
The Fox Insight smartphone application, currently available on the Android platform, is connected to a Pebble smartwatch application (Fox Insight Wear) that users wear on their wrist. The app is designed to bring value to patients in their day-to-day lives. It lets patients report their medication usage and how they feel, providing an electronic diary that logs their personal, subjective overall state, and it also lets them add medication reminders. Armed with personalized information and graphs as well as the medication history provided by the app, patients are able to track and monitor their activity levels, tremor, and nighttime activity, allowing them to fine-tune their regimen to suit their personal preferences and needs. Patients’ time-stamped records of behavior will help researchers correlate patients’ activity, feelings, and medications into meaningful hypotheses that can later be tested through normal scientific methods.
IoT at the service of PD research
Data is our most valuable asset: the more data, the better. But more data also presents a greater challenge. The wearable inertial measurement units (IMUs) are capable of recording 150 to 300 samples of sensor data per second per user. With hundreds or thousands of concurrent users streaming data for months and years, there is an obvious requirement for a system that can collect, store, and process massive amounts of data. To fulfill this requirement, we use an IoT framework. The big data tools used in the platform are all scalable and include, among others, a messaging framework (the Mosquitto MQTT broker plus the Akka toolkit, which allows for distributed parallel processing), big data storage based on the Cloudera distribution of Hadoop, and an application interface layer based on the Play framework.
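To get a feel for the volumes involved, here is a small back-of-the-envelope sketch (my own arithmetic, not figures from the project; the bytes-per-sample value is an assumption) estimating the per-user data produced at the stated sensor rates:

```python
# Back-of-the-envelope estimate of daily IMU sample volume per user.
# The 150-300 Hz rates come from the text; the bytes-per-sample figure
# is an assumption for illustration only.
SECONDS_PER_DAY = 24 * 60 * 60
BYTES_PER_SAMPLE = 16  # assumed: timestamp plus a few 3-axis sensor values

for rate_hz in (150, 300):
    samples_per_day = rate_hz * SECONDS_PER_DAY
    mb_per_day = samples_per_day * BYTES_PER_SAMPLE / 1e6
    print(f"{rate_hz} Hz -> {samples_per_day:,} samples/day (~{mb_per_day:.0f} MB/user/day)")
```

Even at the low end, a single user generates tens of millions of samples per day, which is why a scalable ingestion and storage pipeline is a hard requirement rather than a nice-to-have.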
Intel’s contribution to science & research
At Intel, we believe that piloting a Big Data approach to treatment discovery will highlight the possibilities to the healthcare industry. Providing a secure archive of patients’ data, as well as our contributions to algorithm design may help accelerate first stage discoveries.
Although the immediate goal is to improve the quality of care for PD sufferers and lead clinical research scientists toward a potential cure, we believe that the tools, methods, and algorithms should be applicable to clinical trials for other afflictions and to other scientific discoveries in general.
To get involved or to learn more about this study, visit https://www.michaeljfox.org/fox-insight-form.html.
On July 30th, Intel hosted the second meeting of the US Department of Commerce Data Advisory Council at our headquarters in Santa Clara, California. The primary building of our headquarters, and where the meeting was located, is named after Robert … Read more >
The post The “New Math” of Data Innovation: Go Do Something Wonderful appeared first on Policy@Intel.
Rooted in a long history of successful collaboration in scientific and technological innovations between Russia and the West, Intel and its ecosystem partners celebrated the opening of a new Intel IoT Ignition Lab in Moscow this summer. The Internet of … Read more >
The post Intel Champions Russian Tech Innovations at IoT Ignition Lab in Moscow appeared first on IoT@Intel.
As many of you know, Meshcentral is an open source remote management web site that allows you to remotely monitor and control a wide range of computers (Windows, OSX, Linux, Android, ChromeOS…)… Read more
Are you ready to continue the journey to software-defined infrastructure? In an earlier post, I explored Two Key Stops along the Road to SDI: Automation and Orchestration. These stops are essential milestones in the trip to the ultimate destination: an SLA Managed data center.
In the SDI maturity model, the Automation and Orchestration stages feed into the SLA Managed stage, but the truth is they alone won’t get you there. To get to your final destination, your applications must be written to take full advantage of a cloud environment. Cloud-aware apps are like the vehicles on the road to the SLA Managed data center.
In more specific terms, cloud-aware apps know what they need to do to fully leverage the automation and orchestration capabilities of the SDI platform. They are written to allow for expansion and contraction automatically to maintain optimal levels of performance, availability, and efficiency. They understand that in a cloud environment, there are multiple routes to a database and multiple systems available to process data. They, in essence, do not worry about redundancy as the automation and orchestration will manage it in the environment.
This is quite unlike the conventional approach to apps. Most of today’s apps are tightly coupled with a particular database and a certain set of infrastructure resources. They require items such as session persistence and connection management. If any of the links break—for example, the app loses its connection to the database—the app goes down and IT admins go into fire-drill mode as they scramble to bring the app back online. Over the past 20 years, we have done our best to automate the fire drill.
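The contrast above can be sketched in a few lines. This is a toy illustration with made-up endpoint names, not code from any particular SDI platform: a cloud-aware client treats every route to the database as interchangeable and fails over, where a conventional app would simply go down.

```python
# Minimal failover sketch: a cloud-aware client tries each available
# route to the database instead of pinning itself to one connection.
class EndpointDown(Exception):
    pass

def query(endpoint, sql):
    """Stand-in for a real database call; 'db-primary' is down in this demo."""
    if endpoint == "db-primary":
        raise EndpointDown(endpoint)
    return f"{endpoint}: rows for {sql!r}"

def cloud_aware_query(endpoints, sql):
    """Try every known route; only give up when all of them fail."""
    for ep in endpoints:
        try:
            return query(ep, sql)
        except EndpointDown:
            continue  # a conventional, tightly coupled app would fail here
    raise RuntimeError("all routes exhausted")

result = cloud_aware_query(
    ["db-primary", "db-replica-1", "db-replica-2"], "SELECT 1")
print(result)  # served by the first healthy replica
```

In a real SDI environment this retry logic would typically live in the orchestration layer or a client library rather than in every application, but the principle is the same: the app assumes redundancy instead of a single fixed connection.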
In a metaphorical view, we’re talking about the difference between baseball and football. In baseball, things pretty much proceed in a linear and predictable manner. There are few moving parts—there’s one pitcher throwing to one batter—and aside from the occasional base-stealer you pretty much know where all the players are at all times. This is the way things work with the conventional app.
In a cloud environment, things are more football-like. The players are all over the place and the same play can unfold in very different ways. When a receiver runs the wrong route, the play doesn’t come to a stop. The quarterback simply looks for other receivers who are in position to make a play. The cloud-aware app functions like a quarterback who improvises to keep the ball moving down the field.
Here’s where things get harder. It’s not a trivial undertaking to make apps cloud-aware. In the case of legacy apps, the code has to be rewritten pretty much from top to bottom to build in cloud-awareness, or the legacy code needs to be wrapped in services so that cloud-aware behavior can happen around the legacy portion. So we’re talking about a lot of heavy lifting for your software developers.
The good news is you don’t have to do all of this heavy lifting at once. We’re still quite some time away from the day of the SLA Managed data center. We have to first build the integrated orchestration platforms and automation toolsets that enable a software-defined approach to the data center. The key is to understand that this day is coming, and begin taking steps to make your apps cloud-aware.
Any new apps should be written to be cloud-aware. As for your legacy apps, you won’t be able to rewrite them all at once, so you’re going to need to identify the apps that are most likely to benefit from cloud-awareness as you move to software-defined infrastructure, or simply wrap them in services.
The wrapped applications can help move many critical apps to a more cloud-like environment without rewriting a lot of code. But those apps won’t be able to benefit from all of the goodness of an SLA Managed data center. In an SLA Managed world, software-defined infrastructure and the apps work in tandem to deliver optimal performance with minimal downtime.
These gains are made possible by the ability of the orchestration platform to move workloads and supporting resources around on the fly to meet the policies you set for your applications. When demand spikes, the SDI environment grabs the resources the app needs to keep performance in line with the required service levels, even if that means bursting to a public cloud to gain additional processing power.
If this sounds like IT nirvana, you’ve got it. In the SLA Managed data center, application downtime will be rare, and unpredictable application performance will seem more like a problem from the past than a constant threat in the present. You’ll be able to breathe easier when unusually large crowds of holiday shoppers converge on a particular app, because you’ll know that the backend systems will take care of themselves.
So that’s the 30,000-foot view of the last stretches of the road to SDI. If you consider where we are today and where we need to travel, you can see that we are talking about a long road, and one that can have many unique twists and turns. The key is to think about how you’re going to get to SDI, identify the vehicles that will move you forward, and then begin your journey.
Find me @EdLGoldman, and share your thoughts and comments.
You probably already know why the Multi-OS Engine in Intel® INDE addresses your cross-platform needs. This blog articulates how your Java skills, and Java skills only, will… Read more
Today marks a great milestone for the Intel® Integrated Native Development Experience (Intel® INDE) suite: Intel announced the Multi-OS Engine, a cool new feature of Intel® INDE, at the Android… Read more
Smart factories, among the first to move forward with the Internet of Things (IoT) thanks to factory automation, will soon reap the benefits of another successful Intel IoT ecosystem collaboration. Intel Security and Honeywell Process Solutions are teaming up to … Read more >
The post Intel and Honeywell Team Up on IoT Security for Industrial appeared first on IoT@Intel.
“Hello, my name is Jeff Ton and it has been one thousand, two hundred and seventy two days since I last opened Outlook.”
February 6, 2012, was an historic date in Indianapolis, Indiana. Yeah, there was some little football game that night, Super Bowl XLVI – the New York Giants against the New England Patriots. But that is not the event that made the date historic (though it was great to watch a Manning beat Brady!). What made that date historic was our go-live on Google Apps, our first step in our Journey to the Cloud.
Now that I have offended everyone from the Pacific Northwest and New England, let me rewind and start at the beginning. In 2010, I arrived at Goodwill Industries of Central Indiana. We were running Microsoft Exchange 2003 coupled with Outlook 2010. Back in the day, the adage was “No one ever got fired for buying IBM”; I was in the “No one ever got fired for buying Microsoft” camp. In fact, when I learned the students in our high school were using Google, I was pretty adamant that they use Office. After all, that is what they will be using when they get jobs!
At about this same time, we were switching from Blackberry to Android-based smartphones. We were having horrible sync problems between Exchange and the Androids using ActiveSync. We needed to upgrade our Exchange environment desperately!
As we were beginning to make plans for upgraded servers to support the upgraded Exchange environment, I attended my first MIT Sloan CIO Symposium in Boston. Despite the fact that I bleed Colts blue, I actually love Boston: the history, the culture, the vibe; but I digress. At the conference I learned about AC vs. CF projects (see That Project is a Real Cluster to learn more). I could not fathom a more likely CF project than an email upgrade project. Why not look to the cloud? Since we were doing an upgrade anyway, perhaps this would be the LAST email upgrade we would ever have to do!
Enter the Google Whisperer. For months a former colleague-turned-Google-consultant had been telling me we should check out Google as an email platform. Usually my response was “Google? That’s for kids, not an enterprise!” (Ok, now I have offended everyone from Silicon Valley, too!) Every time I saw him, he would bring it up. I finally agreed to attend one of Google’s roadshow presentations. I came away from that event with an entirely different outlook (pun intended) on Google.
We decided to run an A/B pilot. We would convert 30 employees to the Google Apps platform for 60 days. We would then convert the same 30 employees to BPOS (the predecessor to Office 365) for 60 days, and may the best man, er, I mean platform, win. We handpicked the employees for the pilot. I purposely selected many who were staunchly in the Microsoft camp and several others who typically resisted change.
At the end of the pilot an amazing thing happened. Not one person on the pilot team wanted to switch off of Google onto BPOS, in fact, each and every person voted to recommend a Google migration to the Executive Team. Unanimous! When was the last time that ever happened in one of your projects?!!?
The decision made, we launched the project to migrate to the cloud! We leveraged this project to also implement our email retention policy (email is retained for five years). The vast majority of the work in the project involved locating all the .PST files in our environment and moving them to a central location from network file folders, local drives, and yes, even thumb drives and CDs. Once in that central location, they were uploaded to the Google platform. During this time, we also mirrored our email environment so every internal and external email also went to the Google platform in real time.
The process took about three months, but finally it was Super Bowl Sunday, time for go-live. Now before you think me an ogre of a boss for scheduling a major go-live for Super Bowl Sunday, I should tell you, the date of February 6, 2012 was selected by the project team. Their thought? No one is going to be doing email after the game is over. We announced a blackout period of eight hours beginning at midnight to do our conversion. Boy, were we ever wrong about the length of the blackout period! Our conversion that night took about 20 minutes. 20 minutes and email was flowing again in and out of the Google environment.
Our implementation included email, contacts, calendar, and groups for three domains. We made the decision to keep the other Google Apps available, but not promote them. We also implemented our five year archive and optional email encryption for sensitive communications. The other decision we made (ok, I made) was not to allow the use of Outlook to access Gmail. One of the tenets of our strategic plan was “Any time, Any place, Any device”, I felt having a piece of client software violated that tenet and created additional support issues that were not necessary.
We learned several things as a result of the project. First, search is not sort. If you have used Gmail, then you know there is no way to sort your Inbox; it relies instead on the power of Google Search. People really like their sort. It took some real handholding to get them comfortable.
Second, Google Groups are not Distribution Lists. We converted all of our Exchange Distribution Lists to Groups. Yes, they do function in somewhat the same way, however, there are many more settings in Groups, settings that can have unexpected consequences. Consequences like the time our CFO replied to an email that had been sent to a Group, and even though he did not use reply all, his reply went to everyone in the Group! We found that setting very quickly and turned it off! (Sorry Dan!)
The third lesson learned was “You cannot train enough.” Yes, we held many classes during the lead-up to conversion and continued them long afterwards. A lot of the feedback we had heard (“everyone has Gmail at home, we already know how to use it”) led us to believe that once the initial project was complete, we didn’t need to continue training. We recently started a series of Google Workshops to continue the learning process. Honestly, I think some of this is generational. Some love to click on links, watch a video, and then use the new functionality. Others really want a classroom environment. We now offer both.
One of the things that pleasantly surprised us (well, at least me) was the organic adoption of other Google tools. The first shared Google Doc came to me from outside the IT department. The first meeting conducted using Google Hangouts came from the Marketing department. People were finding the apps and falling in love with them.
Today, one thousand, two hundred and seventy-two days later our first step to the cloud is seen as a great accomplishment. It has saved us tens of thousands (if not hundreds of thousands) of dollars, thousands of hours, and has freed up our team to work on those AC projects!
Before I close, I do want to say, we are still a Microsoft shop. We have Office, Windows, Server, SQL Server, and many other Microsoft products. This post is not intended to be a promotion of one product over another. As I said in my previous post, your path may be different from ours. For us, a 3,000-employee non-profit, Google was the right choice. You may find it meets your requirements, or you may find another product is a better fit. The point here is not the specific product, but the product’s delivery method…cloud…SaaS. The project was such a resounding success that we changed one of our Application Guiding Principles: we are now “cloud-first” when selecting a new application or upgrading an existing one. In fact, almost all of the applications we have added in the last three and a half years have been SaaS-based, including Workday, Domo, Vonigo, ETO, Facility Dude, and more.
Go and get your Google on, later hit your Twitter up
We out workin’ y’all from summer through the winter, bruh
Red eye precision with the speed of a stock car
You’re now tuned in to some Independent Rock Stars
Next month, we will explore a project that did more to take us to a Value-add revenue generating partner than just about any other project. Amplify Your Value: Reap the Rewards!
The series, “Amplify Your Value” explores our five year plan to move from an ad hoc reactionary IT department to a Value-add revenue generating partner. #AmplifyYourValue
We could not have made this journey without the support of several partners, including, but not limited to: Bluelock, Level 3 (TWTelecom), Lifeline Data Centers, Netfor, and CDW. (mentions of partner companies should be considered my personal endorsement based on our experience and on our projects and should NOT be considered an endorsement by my company or its affiliates).
Jeffrey Ton is the SVP of Corporate Connectivity and Chief Information Officer for Goodwill Industries of Central Indiana, providing vision and leadership in the continued development and implementation of the enterprise-wide information technology and marketing portfolios, including applications, information & data management, infrastructure, security and telecommunications.
Find him on LinkedIn.
Follow him on Twitter (@jtongici)
Add him to your circles on Google+
Check out more of his posts on Intel’s IT Peer Network
Read more from Jeff on Rivers of Thought
And the privilege of listening to several industry leaders and – of great interest – a team of FIT’s top senior students, who presented their vision for the store of tomorrow.
Some common threads:
- We’re living in a world of digital screens – brands can either get on board or get left behind.
- Brand success is as much about effective storytelling as it is about product and operational efficiency. And the best brands tell their stories across the screens.
- When it comes to the millennial shopper, it’s about authenticity and trust.
And, of course, technology is the thread that runs through it all.
Jennifer Schmidt, Principal and leader of the Americas Apparel Fashion and Luxury practice at McKinsey & Company, emphasized the importance of storytelling in this important global segment. According to Ms. Schmidt, 50 percent of value creation in fashion and luxury is about perception – the ability of a brand to consistently deliver (in every facet of the business) a differentiating, conversation-building, relationship-building story.
(Those who joined Dr. Paula Payton’s NRF store tour in January will remember her emphasis on storytelling and narrative).
Ms. Schmidt also spoke to three elements of import in her current strategy work:
- The change in the role of the store – which now shifts from solely emphasizing transactions to brand-building – and with 20-30% fewer doors than before;
- The change in retail formats – which, in developed world retailing, now take five different shapes: 1) flagship store, 2) free-standing format, 3) mini- and urban-free standing, 4) shops within shops and 5) outlet;
- The importance of international expansion, especially to the PRC and South Asia.
Daniella Yacobovsky, co-founder of online jewelry retailer Baublebar, also noted the importance of brand building – and she explained that her brand story is equal parts product and speed. Baublebar works on an eight-week production cycle, achieving previously unheard of turns in jewelry. Data is Ms. Yacobovsky’s friend – she tracks search engine results, web traffic and social media to drive merchandising decisions.
And, last but certainly not least: FIT seniors Rebeccah Amos, Julianne Lemon, Rachel Martin and Alison McDermott, winners of FIT’s Experience Design for Millennials Competition, opined on what makes the best brand experience for millennials. Their unequivocal answer – paired with a lot of good, solid retailing advice – was videos and music.
It’s not just about entertainment. It’s also an issue of trust and authenticity (does a brand’s playlist resonate with you?), which ultimately leads to brand stickiness.
Envision video – and lots of it. On enormous, in-store video walls, on mobile, hand-held devices and on brand YouTube channels. To display products virtually or provide information on how to wear or accessorize them. With in-store video, retailers can orchestrate, curate and simplify, giving shoppers a fast, trusted way to be on trend.
Music? The students suggested that every brand needs a music director. Brand-right soundtracks and playlists and connections to the right bands and music events can be powerful influences on today’s largest consumer group.
Quite the day.
Global Director, Retail Sales
Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
* Other names and brands may be claimed as the property of others.
© 2015 Intel Corporation
When the term design is used in mobile business intelligence (BI), it often refers to the user interface (UI). However, when I consider the question of design in developing a mobile BI strategy, I go beyond what a report or dashboard looks like.
As I wrote in “Mobile BI” Doesn’t Mean “Mobile-Enabled Reports,” when designing a mobile BI solution, we need to consider all facets of user interactions and take a holistic approach in dealing with all aspects of the user experience. Here are three areas of design to consider when developing a mobile BI strategy.
How Should the Mobile BI Assets Be Delivered?
In BI, we typically consider three options for the delivery of assets: push, pull, and hybrid. The basic concept of a “push” strategy is similar to ordering a pizza for home delivery. The “users” passively receive the pizza when it’s delivered, and there’s nothing more that they need to actively do in order to enjoy it (ok, maybe they have to pay for it and tip the driver). Similarly, when users access a report with the push strategy, whether through regular e-mail or a mobile BI app, it’s no different from viewing an e-mail message from a colleague.
On the other hand, to have pizza with the pull strategy, users need to get into their cars and drive to the pizza place. They must take action and “retrieve the asset.” Likewise, users need to take action to “pull” the latest report and/or data, whether they log on using the app or mobile browser. The hybrid approach employs a combination of both the push and pull methods.
Selecting the right delivery system for the right role is critical. For example, the push method may be more valuable for executives and sales teams, who travel frequently and may be short on time. However, data updates are less frequent with the push method, so accessing the latest data can’t be critical if you choose this option. In contrast, the “pull” strategy may be more appropriate for analysts and customer service teams, who depend on the latest data.
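One way to make the role-to-delivery mapping concrete is a simple lookup table. The roles and choices below just restate the examples from this section (they are illustrative, not a prescribed taxonomy):

```python
# Map user roles to a delivery strategy, following the examples above:
# push for time-poor travelers, pull for roles that need the latest data.
DELIVERY_BY_ROLE = {
    "executive": "push",
    "sales": "push",
    "analyst": "pull",
    "customer_service": "pull",
}

def delivery_mode(role, default="hybrid"):
    """Unknown roles fall back to the hybrid approach."""
    return DELIVERY_BY_ROLE.get(role, default)

print(delivery_mode("executive"))       # push
print(delivery_mode("analyst"))         # pull
print(delivery_mode("field_engineer"))  # hybrid (fallback)
```

In practice this mapping would live in the BI platform's configuration rather than in code, but the exercise of writing it down forces the role-by-role decision this section recommends.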
Additional considerations include data security and enterprise mobility. Does the current BI solution or software support both options? Can the integrity of data security be maintained if data assets are delivered outside the demarcation lines (for example, mobile BI report delivered as an attachment to an e-mail)?
What Are the Format and Functionality of the Mobile BI Assets?
The format deals with the type and category of the asset that is delivered to mobile BI users. What does the end-user receive? Is it a static file in Adobe PDF or Microsoft Excel format with self-contained data, or is it dynamic such as a mobile BI app that employs native device functionality? Is the format limited to data consumption, or does it allow for interactions such as “what-if” scenarios or database write-back capability?
If the format supports exploration, what can I do with it? Can I select different data elements at run time as well as different visualization formats? How do I select different values, such as prompts, to filter the result sets? Does the format support offline viewing? Is the format conducive to collaboration?
Does the User Interface Optimize the BI Elements?
The UI represents the typical BI elements that are displayed on the screen: page layout, menus, action buttons, orientation, and so on. When you consider the design, decide if the elements really add value or if they’re just pointless visualizations like empty calories in a diet. You want to include just the “meat” of your assets in the UI. More often than not, a simple table with the right highlighting or alerts can do a better job than a colorful pie chart or bar graph.
In addition, the UI covers the navigation among different pages and/or components of a BI asset or package. How do the users navigate from one section to another on a dashboard?
Bottom Line: Design Is Key for the User Experience
The end-to-end mobile BI user experience is a critical component that requires a carefully thought-out design that includes not only soft elements (such as an inviting and engaging UI), but also hard elements (such as the optimal format for the right role and for the right device). Designing the right solution is both art and science.
The technical solution needs to be built and delivered based on specifications and following best practices – that’s the science part. How we go about it? That’s entirely art. It requires both ingenuity and critical thinking, since not all components of design come with hard-and-fast rules that we can rely on.
What other facets of the mobile BI user experience do you include in your design considerations?
Stay tuned for my next blog in the Mobile BI Strategy series
This story originally appeared on the SAP Analytics Blog.
The Intel® Dual Band Wireless-AC 8260 is Intel’s 3rd Generation 802.11ac, dual band, and 2×2 Wi-Fi + Bluetooth® 4.2 adapter. It’s engineered to deliver lower power consumption1, improved RF coexistence1, and complete Microsoft Windows 10 support. The M.2 1216 form … Read more >
The post Meet 3rd Gen Intel® Wireless-AC 8260 (2×2 802.11ac Wi-Fi) for Windows 10 appeared first on Technology@Intel.
I spend a great deal of time traveling around the world attending technology events and meeting with our customers. I get to test-drive new laptops, 2 in 1s and tablets on a regular basis. And I get to try out … Read more >
The DPDK community comes together face-to-face at this event and continues a forward looking dialogue on the direction of packet processing in a variety of industry segments such as telecom, cloud,… Read more
Young Hackers, Hipsters, and Hustlers Create Future Product Concepts in Our Summer Innovation Program
Teenagers! Not only did I used to be one, but I now have two of them living in my house—and I’m continually amazed by the unique perspective they bring to conversations and their fearlessness in trying new things. So why … Read more >