Sometimes it is hard to believe that the first Star Wars movie came out almost 35 years ago, but over that time, most of the key elements of the Star Wars universe have fully entered the realm of “Pop Culture,” so you can, in an everyday conversation, talk about “Jedi Knights” & most people will understand the context. And when one talks about Jedis, it mostly breaks down to two key attributes: Light Sabers and “The Force.” Now, I’m sure that, over the years, countless Geeks have attempted to solve the elusive “bounded laser beam” problem that stands between us & being able to have our own personal Light Saber, but, to date, no one has been successful (yet!!). However, while the quest for 3 feet of pulsating, solidified photons remains next to perpetual motion machines & easy-to-open bags of airline peanuts on the list of “unsolvable puzzles,” the other side of the Jedi persona seems to be getting closer.
So, admit it, we’ve all done it: you’re sitting on the couch, with the TV remote about 6 feet away, so you reach out with your hand and concentrate, just to see if, maybe, JUST MAYBE, your midi-chlorian count has grown high enough to enable you to float things through the air. How cool would it be to be able to, through just the power of your mind (and the Force), move things like your TV remote? Come on, don’t try to tell me you’ve never tried that (just to check). Well, unfortunately, while true telekinesis remains within the realm of science fiction, virtual telekinesis will soon be within your reach – pardon my pun. Specifically, thanks to the new “Kinect for Windows” SDK that Microsoft has released, embedded developers working on their next Intelligent Systems idea can enable the “virtual you” to move things through your virtual environment as easily as a Jedi can “Force Push” 3 battle droids. Now, one of the first use cases that comes to mind – and that is already being pursued – is the ability to let prospective shoppers easily interact with a wide spectrum of inventory choices, together with ancillary sensors like cameras, so that a person can have, say, the clothes they are considering overlaid on their image on the screen, letting people try things on quickly. There was a basic example of this shown at the recent NRF Tradeshow:
(Another, similar video was Macy’s Beauty Spot; while it didn’t use this technology, you can imagine how it could in the future.)
Now, with this said, what are the strengths & weaknesses of this approach? Let’s start by breaking out shopping by the genders: Men & Women…
The Male Point of View
For men, shopping tends to be a task: I need a new tie, or a blue dress shirt, or perhaps I’m getting ready for a ski trip. In general, at least among the men I know, one of the key attributes of a “successful shopping trip” (besides “did I get what I needed?”) is: did I get my shopping done in the minimum amount of time possible? I’m sure there are some men who enjoy perusing racks & racks of clothing options, but, personally, I had to spell-check “perusing” because it is a word I use so infrequently. So, in this situation, the advantages of this approach are easy to see. Maybe the “shopping interface” has a nice menu hierarchy so that I can quickly go from “Tops” >> “Dress Shirts” >> <filter_on> [blue] – and, to top it off, the sensor can already tell my size, so I don’t even have to enter that. Perfect: I just went from, say, 200 “tops” in the store to 50 “dress shirts” and down to just 10 that are in blue. I may even filter on “in stock” to narrow that list down a little more and, after a little quick “Jedi Hand Waving” (say, waving my hand <up> = “save” & <down> = “discard”), I may be able to bring the list down to just 3 shirts to actually go try on. At the end, the system (which is tied into the store’s inventory) could, at the very least, direct me to where in the store (say, by rack number) the 3 shirts I need to try on are. In the “future” – it might even be possible for “robotic fetchers” to pull the shirts on my list from a huge, high-density, but low-aesthetics warehouse in the back & bring them to a dressing room for me to try on when I’m ready. Done and Done. I’m in & out in record time and probably happy as a clam.
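To make that narrowing flow concrete, here is a minimal Python sketch. Everything in it is hypothetical – the inventory records, field names, and gesture strings are invented for illustration, and a real system would be fed gesture events from the Kinect SDK’s skeletal tracking rather than a plain list:

```python
# Hypothetical inventory records; a real store would pull these from its
# inventory database.
inventory = [
    {"id": 1, "category": "dress shirt", "color": "blue", "in_stock": True},
    {"id": 2, "category": "dress shirt", "color": "white", "in_stock": True},
    {"id": 3, "category": "polo", "color": "blue", "in_stock": False},
]

def filter_items(items, category=None, color=None, in_stock=None):
    """Apply the 'Tops >> Dress Shirts >> [blue]' style narrowing."""
    result = items
    if category is not None:
        result = [i for i in result if i["category"] == category]
    if color is not None:
        result = [i for i in result if i["color"] == color]
    if in_stock is not None:
        result = [i for i in result if i["in_stock"] == in_stock]
    return result

def apply_gestures(items, gestures):
    """Wave <up> = save, <down> = discard; one gesture per shortlisted item."""
    return [item for item, g in zip(items, gestures) if g == "up"]

shortlist = filter_items(inventory, category="dress shirt", color="blue")
final = apply_gestures(shortlist, ["up"])  # the shirts to actually go try on
```

The point is just how little machinery the “shopping interface” side needs once the hard part – reliable gesture recognition – is handled by the sensor.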
So, in this case, the advantages are:
- Customers are happy because shopping was faster
- The “money earned per time a person was in the store” (also called turnover) went up
- The amount of “failed try-ons” (think of how a shirt you try on but don’t buy costs the store in terms of re-folding it, etc.) goes down
- The store comes away looking cool, edgy, on the front of the learning curve (or, in other words: Positive Brand Image)
The Softer Side of Shopping
Now, let’s look at the ladies. Again, in general – I’m sure there are exceptions – many women I know actually enjoy the “Quest for Clothes” (with the associated bugle call being entirely optional). That being said, I’ve seen various stats on the ratio of, say, how many jeans the average woman tries on to how many she buys, so, while shopping may be “enjoyable,” frustration (just like with any activity, like golf) is not. So, right off the cuff, you could envision a system where perhaps a woman has already been “digitally measured” (say, in a bathing suit) so that the store’s computers (but NO human being) can actually know some fairly exacting measurements, beyond the overly stereotypical Bust/Waist/Hips attributes, thereby allowing an immediate exclusion of clothes that simply won’t fit a particular woman’s shape. While this may not remove all the frustration of trying on clothes, one could imagine how this approach could greatly reduce the amount of frustration a lady experiences, especially when she is considering a new brand that she hasn’t tried previously.
Additionally, some future systems could be modeled after the “personal shopper” motif, where the system takes the image of the woman, considers her exact measurements, and perhaps even her previous purchasing history to make “recommendations” for any given customer. Where men may come into stores with a very specific item in mind, this alternate approach for women – “I have some things I think you might like” – might be very well received. It would not be dissimilar to what companies like Amazon have done with their “Customers who viewed this also viewed” feature, which I, personally, have found to be very useful. Sharing anonymized data about what the hot sellers are, or what other women in your demographic or with your body shape are buying, may help to influence what a woman buys. Additionally, attributes like, say, hair length or color may further help the system make recommendations for a lady that would have previously been unobtainable except at perhaps the most influential (and expensive) of Beverly Hills boutiques. In general, whenever a store can give customers the “high-end experience” at the “mid-range price” – the situation tends to turn out well for everyone.
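The fit-exclusion step underlying both paragraphs can be sketched in a few lines of Python. The measurement names, the 2 cm tolerance, and the catalog entries are all invented for illustration – a real system would use a garment’s actual sizing data and a much richer body model:

```python
def fits(garment, customer, tolerance_cm=2.0):
    """A garment 'fits' if every measurement both records share is
    within the tolerance (an assumed, deliberately simple rule)."""
    shared = garment.keys() & customer.keys()
    return all(abs(garment[m] - customer[m]) <= tolerance_cm for m in shared)

# Hypothetical "digitally measured" customer, in centimeters.
customer = {"bust": 90.0, "waist": 70.0, "hips": 96.0}

# Hypothetical catalog; non-measurement keys like "name" are simply
# ignored by the shared-keys intersection above.
catalog = [
    {"name": "Jeans A", "waist": 71.0, "hips": 95.0},
    {"name": "Jeans B", "waist": 76.0, "hips": 101.0},
]

# Immediately exclude anything that simply won't fit this shopper.
wearable = [g for g in catalog if fits(g, customer)]
```

A recommendation engine would then rank only the `wearable` list – by purchase history, demographic popularity, and so on – rather than the whole catalog.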
So, for the ladies, the advantages may be:
- Customers are happier because a reasonable portion of their frustration over misfitting clothes has been removed
- The “money earned per time a person was in the store” (also called turnover) went up (still true here)
- The amount of “failed try-ons” (think of how many more items a woman may try on & not buy than men do, today) goes down
- The store comes away looking cool, edgy, on the front of the learning curve (perhaps even more true than for men)
Areas for Improvement
However, it wouldn’t be fair to look at the up-side of this approach without looking at the down-side. One of the first things people will probably mention is cost. Large screens, gesture recognition, perhaps voice recognition – all of this could add up to thousands of dollars, even at the low end. But the other side of the equation is: how much do employees cost? Even for your nominal mall store paying employees roughly minimum wage, if you consider that the Arizona minimum wage is $7.65, and you multiply by 40 hours a week or 2,000 hours a year, you get to just over $15,000 before you factor in any benefits (like health care or employee discounts) that the human employee gets, or other less obvious factors such as training the people (on your products, on appropriate workplace behavior, etc.), plus just the issue of scheduling the people. If your store is open from 8 AM to 9 PM, 6 days a week, plus 10 AM to 6 PM on Sundays, that is 86 hours a week. For a human, that would be a lot of overtime. For a machine, it will cost roughly 11 cents per kilowatt-hour of electricity to work 86 hours a week – almost certainly a bargain in comparison.
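The back-of-the-envelope math can be laid out explicitly. The wage and electricity rate are the figures cited above; the kiosk’s 500 W power draw is an assumption added purely for illustration:

```python
# One minimum-wage employee, per year, before benefits or training.
wage_per_hour = 7.65            # Arizona minimum wage cited above
hours_per_year = 2000           # ~40 hours/week over a working year
employee_cost = wage_per_hour * hours_per_year

# Electricity for a kiosk running every open hour:
# 8 AM-9 PM (13 h) six days a week, plus 10 AM-6 PM (8 h) on Sunday.
open_hours_per_week = 13 * 6 + 8    # = 86 hours
kiosk_kw = 0.5                      # assumed 500 W draw (hypothetical)
rate_per_kwh = 0.11                 # electricity rate cited above
kiosk_cost = open_hours_per_week * 52 * kiosk_kw * rate_per_kwh

print(round(employee_cost))     # 15300
```

Under these assumptions the kiosk’s annual electricity bill comes out to roughly $250 – two orders of magnitude below the wage bill, which is the whole point of the comparison.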
Another factor worth considering is privacy. If, in future systems, a store’s database not only knows your name, but also details like your exact body type or your purchasing history, then people will want to trust that their information is safe. Luckily, this is in no way a new concern for stores, considering they probably already have my name & credit card information – which I care a LOT more about than whether it leaks how many T-shirts I buy a year. Security & privacy are two aspects that Intel has been supporting for a long time, & some of the technologies contained within our vPro-based systems are especially adept at ensuring stores can keep information safe, while keeping issues like remote management easy. So, from this standpoint, retailers have to remain vigilant to guard their customers’ information just as they always have – really nothing new here.
So, “virtual reality” (or, more technically, Augmented Reality) will likely be coming to a mall near you – perhaps in a year, perhaps in a few years. But just like bar-code readers, this technology will start slowly and eventually become more common, especially as costs come down with volume. It wasn’t all that long ago (maybe just 20 years) that laser scanners to read bar-codes were only for high-end establishments. And, as with any new technology, there will be triumphs & “issues” as the technology grows & evolves (imagine a bar-code reader that connected to a bad database & tried to charge you $936 for a box of soda, instead of $9.36).
Other usages for the new “Kinect for Windows” SDK – besides shopping – include:
- Controlling larger than life boxing robots (come-on, this is a GIVEN)
- Virtual-Reality Physical Therapy (like for our seniors)
- Controlling bomb-disposal robots (look how complicated it can be)
But, what’s your opinion? Are you “eager” to see the technology put into place, or do you view this as just the latest “intrusion” into our lives? Besides shopping and the other examples above, where else could you see this technology being leveraged? Leave your ideas in the comments below or hit me on Twitter: @Geek8ive!