Microsoft's forthcoming Kinect camera controller for the Xbox 360 plays better with standing humans than it does with couch potatoes. Microsoft says it's still "calibrating" Kinect for seated players and may accomplish that with a little help from Amazon.com.
The Xbox 360 maker may have turned to Amazon's Mechanical Turk service to make Kinect's depth-sensing camera work better when confronted with furniture, pets and people of varying shapes and sizes.
For those unfamiliar with Amazon Mechanical Turk, it's an online service that distributes thousands of small tasks to a "diverse, on-demand, scalable workforce." Those tasks include tagging objects in a photo or video, transcribing audio recordings or writing short articles. It's grunt work that typically pays pennies per repetitive task.
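For a sense of how a requester posts this kind of job programmatically, here's a minimal sketch using Amazon's MTurk API through Python's boto3 library. It's purely illustrative (boto3 postdates these listings), and the title, reward, image URL and form contents are placeholders, not the actual Microsoft job.

```python
import boto3

# MTurk's API endpoint lives in us-east-1.
mturk = boto3.client("mturk", region_name="us-east-1")

# A HIT needs a "question" for workers to answer; here it's an HTML page
# that shows one image and asks the worker to tag upper-body joints.
# The image URL and form are placeholders for illustration only.
question_xml = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html>
      <body>
        <p>Is there an identifiable human head in this image?
           If so, click the head, shoulders, elbows and hands.</p>
        <img src="https://example.com/frames/frame_0001.gif" />
        <!-- answer form fields omitted for brevity -->
      </body>
    </html>
  ]]></HTMLContent>
  <FrameHeight>600</FrameHeight>
</HTMLQuestion>"""

response = mturk.create_hit(
    Title="Tag upper-body joints in a depth-camera image",
    Description="Mark the head, shoulders, elbows and hands if a person is visible.",
    Keywords="image, tagging, annotation",
    Reward="0.02",                        # pennies per task, as is typical
    MaxAssignments=3,                     # have several workers tag each frame
    LifetimeInSeconds=7 * 24 * 60 * 60,   # listing stays up for a week
    AssignmentDurationInSeconds=600,
    Question=question_xml,
)
print("HIT ID:", response["HIT"]["HITId"])
```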
Kotaku reader Charonchan pointed us to a series of Mechanical Turk jobs (HITs, or Human Intelligence Tasks) that appear to be Kinect related. Workers are asked to look at an image, determine whether there is an identifiable human head in the shot, then tag the head, shoulders, elbows and hands with a simple skeletal frame.
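The listing doesn't show what a completed task hands back, but conceptually each tagged frame boils down to a small record like the sketch below; the field names, joint labels and pixel coordinates are illustrative assumptions, not the actual data format.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

# (x, y) position of a tagged joint, in image pixels.
Joint = Tuple[int, int]

@dataclass
class UpperBodyAnnotation:
    """Hypothetical result of one tagging HIT: either 'no person here'
    or a rough upper-body skeleton marked by the worker."""
    image_id: str
    has_person: bool
    joints: Optional[Dict[str, Joint]] = None

# Example of what one tagged frame might look like (coordinates invented).
example = UpperBodyAnnotation(
    image_id="frame_0001",
    has_person=True,
    joints={
        "head": (212, 88),
        "left_shoulder": (178, 142),
        "right_shoulder": (246, 140),
        "left_elbow": (160, 198),
        "right_elbow": (268, 200),
        "left_hand": (150, 250),
        "right_hand": (282, 248),
    },
)
```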
Many of the images show people seated on couches or near tables, chairs and Guitar Hero drum controllers. The images are animated, which helps in picking out details in these low-quality, monochromatic shots.
The images are full of variety: sofas, lamps, ottomans, coffee tables, big people, little people, dogs and all kinds of distractions that might confuse Kinect's infrared projector and depth sensor. They're available on Amazon Mechanical Turk for studying and tagging until next week.
While the HIT listing doesn't specifically mention that this is related to Kinect or Xbox 360, the job requestor links back to the "Upper Body Image Tagger" on Microsoft's Windows.Net site.
Kotaku reached out to Microsoft earlier today to get clarification and comment on whether Kinect is being tweaked with the help of Amazon Mechanical Turk users, but the company has not yet responded.