{"id":3573,"date":"2018-09-11T07:43:31","date_gmt":"2018-09-11T07:43:31","guid":{"rendered":"https:\/\/athis-consulting.com\/news\/?p=3573"},"modified":"2018-10-02T10:27:46","modified_gmt":"2018-10-02T10:27:46","slug":"robots-can-now-pick-up-any-object-after-inspecting-it","status":"publish","type":"post","link":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/","title":{"rendered":"Robots can now pick up any object after inspecting it"},"content":{"rendered":"<div id=\"text\">\n<p>More<a id=\"article\"><\/a> recently, breakthroughs in computer vision have enabled robots to make basic distinctions between objects, but even then, they don&#8217;t truly understand objects&#8217; shapes, so there&#8217;s little they can do after a quick pick-up.<\/p>\n<p>In a new paper, researchers from MIT&#8217;s Computer Science and Artificial Intelligence Laboratory (CSAIL), say that they&#8217;ve made a key development in this area of work: a system that lets robots inspect random objects, and visually understand them enough to accomplish specific tasks without ever having seen them before.<\/p>\n<p>The system, dubbed &#8220;Dense Object Nets&#8221; (DON), looks at objects as collections of points that serve as &#8220;visual roadmaps&#8221; of sorts. This approach lets robots better understand and manipulate items, and, most importantly, allows them to even pick up a specific object among a clutter of similar objects &#8212; a valuable skill for the kinds of machines that companies like Amazon and Walmart use in their warehouses.<\/p>\n<p>For example, someone might use DON to get a robot to grab onto a specific spot on an object &#8212; say, the tongue of a shoe. 
From that, it can look at a shoe it has never seen before, and successfully grab its tongue.<\/p>\n<p>&#8220;Many approaches to manipulation can&#8217;t identify specific parts of an object across the many orientations that object may encounter,&#8221; says PhD student Lucas Manuelli, who wrote a new paper about the system with lead author and fellow PhD student Pete Florence, alongside MIT professor Russ Tedrake. &#8220;For example, existing algorithms would be unable to grasp a mug by its handle, especially if the mug could be in multiple orientations, like upright, or on its side.&#8221;<\/p>\n<p>The team views potential applications not just in manufacturing settings, but also in homes. Imagine giving the system an image of a tidy house, and letting it clean while you&#8217;re at work, or using an image of dishes so that the system puts your plates away while you&#8217;re on vacation.<\/p>\n<p>What&#8217;s also noteworthy is that none of the data was actually labeled by humans; rather, the system is &#8220;self-supervised,&#8221; so it doesn&#8217;t require any human annotations.<\/p>\n<h2><strong>Making it easy to grasp<\/strong><\/h2>\n<p>Two common approaches to robot grasping involve either task-specific learning or creating a general grasping algorithm. These techniques both have obstacles: task-specific methods are difficult to generalize to other tasks, and general grasping doesn&#8217;t get specific enough to deal with the nuances of particular tasks, like putting objects in specific spots.<\/p>\n<p>The DON system, however, essentially creates a series of coordinates on a given object, which serve as a kind of &#8220;visual roadmap&#8221; of the object, to give the robot a better understanding of what it needs to grasp, and where.<\/p>\n<p>The team trained the system to look at objects as a series of points that make up a larger coordinate system. 
It can then map different points together to visualize an object&#8217;s 3-D shape, similar to how panoramic photos are stitched together from multiple photos. After training, if a person specifies a point on an object, the robot can take a photo of that object, and identify and match points to be able to then pick up the object at that specified point.<\/p>\n<p>This is different from systems like UC-Berkeley&#8217;s DexNet, which can grasp many different items, but can&#8217;t satisfy a specific request. Imagine an 18-month-old infant, who doesn&#8217;t understand which toy you want it to play with but can still grab lots of items, versus a four-year-old who can respond to &#8220;go grab your truck by the red end of it.&#8221;<\/p>\n<p>In one set of tests done on a soft caterpillar toy, a Kuka robotic arm powered by DON could grasp the toy&#8217;s right ear from a range of different configurations. This showed that, among other things, the system has the ability to distinguish left from right on symmetrical objects.<\/p>\n<p>When testing on a bin of different baseball hats, DON could pick out a specific target hat despite all of the hats having very similar designs &#8212; and having never seen pictures of the hats in training data before.<\/p>\n<p>&#8220;In factories, robots often need complex part feeders to work reliably,&#8221; says Manuelli. 
&#8220;But a system like this that can understand objects&#8217; orientations could just take a picture and be able to grasp and adjust the object accordingly.&#8221;<\/p>\n<p>In the future, the team hopes to improve the system to a place where it can perform specific tasks with a deeper understanding of the corresponding objects, like learning how to grasp an object and move it with the ultimate goal of, say, cleaning a desk.<\/p>\n<p>The team will present their paper on the system next month at the Conference on Robot Learning in Z\u00fcrich, Switzerland.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>More recently, breakthroughs in computer vision have enabled robots to make basic distinctions between objects, but even then, they don&#8217;t truly understand objects&#8217; shapes, so there&#8217;s little they can do after a quick pick-up. In a new paper, researchers from MIT&#8217;s Computer Science and Artificial Intelligence Laboratory (CSAIL) say that they&#8217;ve made a key development [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":3580,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"amp_status":"","_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"Robots can now pick up any object after inspecting it","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","enabled":false}}},"categories":[208,349],"tags":[309,134],"jetpack_publicize_connections":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v22.8 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Robots can now pick up any object after inspecting it - AthisNews<\/title>\n<meta 
name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Robots can now pick up any object after inspecting it - AthisNews\" \/>\n<meta property=\"og:description\" content=\"More recently, breakthroughs in computer vision have enabled robots to make basic distinctions between objects, but even then, they don&#8217;t truly understand objects&#8217; shapes, so there&#8217;s little they can do after a quick pick-up. In a new paper, researchers from MIT&#8217;s Computer Science and Artificial Intelligence Laboratory (CSAIL), say that they&#8217;ve made a key development [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/\" \/>\n<meta property=\"og:site_name\" content=\"AthisNews\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/AthisNews-271175647014395\/\" \/>\n<meta property=\"article:published_time\" content=\"2018-09-11T07:43:31+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2018-10-02T10:27:46+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"639\" \/>\n\t<meta property=\"og:image:height\" content=\"426\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Mickael Madjour\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" 
content=\"@https:\/\/twitter.com\/MMickael82\" \/>\n<meta name=\"twitter:site\" content=\"@Athis_News\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Mickael Madjour\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/\",\"url\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/\",\"name\":\"Robots can now pick up any object after inspecting it - AthisNews\",\"isPartOf\":{\"@id\":\"https:\/\/athis-technologies.com\/news\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg\",\"datePublished\":\"2018-09-11T07:43:31+00:00\",\"dateModified\":\"2018-10-02T10:27:46+00:00\",\"author\":{\"@id\":\"https:\/\/athis-technologies.com\/news\/#\/schema\/person\/4e336579b063b8819b1c5bee603430a8\"},\"breadcrumb\":{\"@id\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspec
ting-it\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#primaryimage\",\"url\":\"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg\",\"contentUrl\":\"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg\",\"width\":639,\"height\":426},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/athis-technologies.com\/news\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Robots can now pick up any object after inspecting it\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/athis-technologies.com\/news\/#website\",\"url\":\"https:\/\/athis-technologies.com\/news\/\",\"name\":\"AthisNews\",\"description\":\"More than Words\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/athis-technologies.com\/news\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/athis-technologies.com\/news\/#\/schema\/person\/4e336579b063b8819b1c5bee603430a8\",\"name\":\"Mickael Madjour\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/athis-technologies.com\/news\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/2u1pp9J__400x400-100x100.jpg\",\"contentUrl\":\"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/2u1pp9J__400x400-100x100.jpg\",\"caption\":\"Mickael Madjour\"},\"description\":\"As an Expert in IT 
&amp; AI, Mickael brings fresh news about Emerging, Wearable Techs, and IT Innovation. He has 12+ years as a Software Engineer in IT and Telecom companies. He is now a contributor at Athis News.\",\"sameAs\":[\"https:\/\/athis-technologies.com\/news\/?author=5\",\"https:\/\/www.linkedin.com\/in\/mickael-madjour-3728a3307\/\",\"https:\/\/twitter.com\/MMickael82\"],\"url\":\"https:\/\/athis-technologies.com\/news\/author\/mmickael\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Robots can now pick up any object after inspecting it - AthisNews","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/","og_locale":"en_US","og_type":"article","og_title":"Robots can now pick up any object after inspecting it - AthisNews","og_description":"More recently, breakthroughs in computer vision have enabled robots to make basic distinctions between objects, but even then, they don&#8217;t truly understand objects&#8217; shapes, so there&#8217;s little they can do after a quick pick-up. 
In a new paper, researchers from MIT&#8217;s Computer Science and Artificial Intelligence Laboratory (CSAIL), say that they&#8217;ve made a key development [&hellip;]","og_url":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/","og_site_name":"AthisNews","article_publisher":"https:\/\/www.facebook.com\/AthisNews-271175647014395\/","article_published_time":"2018-09-11T07:43:31+00:00","article_modified_time":"2018-10-02T10:27:46+00:00","og_image":[{"width":639,"height":426,"url":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg","type":"image\/jpeg"}],"author":"Mickael Madjour","twitter_card":"summary_large_image","twitter_creator":"@https:\/\/twitter.com\/MMickael82","twitter_site":"@Athis_News","twitter_misc":{"Written by":"Mickael Madjour","Est. reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/","url":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/","name":"Robots can now pick up any object after inspecting it - 
AthisNews","isPartOf":{"@id":"https:\/\/athis-technologies.com\/news\/#website"},"primaryImageOfPage":{"@id":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#primaryimage"},"image":{"@id":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#primaryimage"},"thumbnailUrl":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg","datePublished":"2018-09-11T07:43:31+00:00","dateModified":"2018-10-02T10:27:46+00:00","author":{"@id":"https:\/\/athis-technologies.com\/news\/#\/schema\/person\/4e336579b063b8819b1c5bee603430a8"},"breadcrumb":{"@id":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#primaryimage","url":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg","contentUrl":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg","width":639,"height":426},{"@type":"BreadcrumbList","@id":"https:\/\/athis-technologies.com\/news\/innovation\/ai-big-data\/2018\/robots-can-now-pick-up-any-object-after-inspecting-it\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/athis-technologies.com\/news\/"},{"@type":"ListItem","position":2,"name":"Robots can now pick up any object after inspecting 
it"}]},{"@type":"WebSite","@id":"https:\/\/athis-technologies.com\/news\/#website","url":"https:\/\/athis-technologies.com\/news\/","name":"AthisNews","description":"More than Words","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/athis-technologies.com\/news\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/athis-technologies.com\/news\/#\/schema\/person\/4e336579b063b8819b1c5bee603430a8","name":"Mickael Madjour","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/athis-technologies.com\/news\/#\/schema\/person\/image\/","url":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/2u1pp9J__400x400-100x100.jpg","contentUrl":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/2u1pp9J__400x400-100x100.jpg","caption":"Mickael Madjour"},"description":"As an Expert in IT &amp; AI, Mickael brings fresh news about Emerging, Wearable Techs, and IT Innovation. 
He has 12+ years as a Software Engineer in IT and Telecom companies. He is now a contributor at Athis News.","sameAs":["https:\/\/athis-technologies.com\/news\/?author=5","https:\/\/www.linkedin.com\/in\/mickael-madjour-3728a3307\/","https:\/\/twitter.com\/MMickael82"],"url":"https:\/\/athis-technologies.com\/news\/author\/mmickael\/"}]}},"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"https:\/\/athis-technologies.com\/news\/wp-content\/uploads\/2018\/09\/CSAIL-DON-system-Manuelli-with-Kuka-robot-MIT-00_0.jpg","jetpack_shortlink":"https:\/\/wp.me\/p9Volu-VD","_links":{"self":[{"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/posts\/3573"}],"collection":[{"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/comments?post=3573"}],"version-history":[{"count":0,"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/posts\/3573\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/media\/3580"}],"wp:attachment":[{"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/media?parent=3573"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/categories?post=3573"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/athis-technologies.com\/news\/wp-json\/wp\/v2\/tags?post=3573"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}