Keynotes

Satoshi Tadokoro

Graduate School of Information Sciences, Tohoku University

Title: Search in Rubble Piles - ImPACT Tough Robotics Challenge

Abstract

Large-scale disasters sometimes destroy artificial structures, and victims buried in rubble piles must wait for search and rescue. Robotics is expected to support humanitarian activities where the risks and difficulties exceed human capacity, and RoboCup has significantly stimulated this research area. This keynote speech presents efforts to create advanced robotic systems for gathering information in debris. Active Scope Camera is a serpentine robot that searches in debris, crawling and levitating through gaps of a few centimeters and using visual, auditory, and haptic sensing for navigation and victim search. Cyber Rescue Canine is a digitally empowered rescue dog wearing a suit that monitors its behavior and condition and allows its actions to be guided remotely. We will discuss how robotics can help mitigate disaster damage in the future.

Bio

Satoshi Tadokoro graduated from the University of Tokyo in 1984. He was an associate professor at Kobe University from 1993 to 2005 and has been a professor at Tohoku University since 2005. He was a Vice/Deputy Dean of the Graduate School of Information Sciences in 2012-2014 and has been the Director of the Tough Cyberphysical AI Research Center at Tohoku University since 2019. He has been the President of the International Rescue System Institute since 2002 and was the President of the IEEE Robotics and Automation Society from 2016 to 2017. He served as the Program Manager of the Ministry of Education's DDT Project on rescue robotics in 2002-2007 and as the Project Manager of the Japan Cabinet Office ImPACT Tough Robotics Challenge Project on disaster robotics in 2014-2019, which involved 62 international PIs and 300 researchers and created the Cyber Rescue Canine, the Dragon Firefighter, and other systems. His research team at Tohoku University has developed various rescue robots, two of which, Quince and Active Scope Camera, are widely recognized for their contribution to disaster response, including missions in the nuclear reactor buildings of the Fukushima-Daiichi Nuclear Power Station. He is an IEEE Fellow, RSJ Fellow, JSME Fellow, and SICE Fellow.

Angelica Lim

Rosie Lab, School of Computing Science at Simon Fraser University

Title: Social Signals in the Wild: Multimodal Machine Learning for Human-Robot Interaction

Abstract

Science fiction has long promised us interfaces and robots that interact with us as smoothly as humans do - Rosie the Robot from The Jetsons, C-3PO from Star Wars, and Samantha from Her. Today, interactive robots and voice user interfaces are moving us closer to effortless, human-like interactions in the real world. In this talk, I will discuss the opportunities and challenges in creating technologies that can analyze, detect, and generate non-verbal communication, including gestures, gaze, auditory signals, and facial expressions. Specifically, I will discuss how we might allow robots to understand human social signals (including emotions, mental states, and attitudes) across cultures, as well as recognize and generate expressions with diversity in mind.

Bio

Dr. Angelica Lim is the Director of the Rosie Lab (www.rosielab.ca) and an Assistant Professor of Professional Practice in the School of Computing Science at Simon Fraser University, Canada. Previously, she led the Emotion and Expressivity teams for the Pepper humanoid robot at SoftBank Robotics. She received her B.Sc. in Computing Science (Artificial Intelligence Specialization) from SFU and her Master's and Ph.D. in Computer Science (Intelligence Science) from Kyoto University, Japan. She has been featured on the BBC and at TEDx, hosted a TV documentary on robotics, and was recently featured in Forbes' 20 Leading Women in AI.

Manukid Parnichkun

Asian Institute of Technology

Title: Driverless Car Technologies

Abstract

Driverless cars first became widely known through the DARPA Grand Challenge in 2004, a competition of driverless cars in a desert area. The competition evolved into the DARPA Urban Challenge in 2007, held in an urban area. Since then, research and development of driverless cars has been conducted extensively for commercialization, for example the driverless car by Google and electric cars with driver-assistance functions by Tesla. Today, most car manufacturers actively research and develop driverless cars. In Thailand, the Thai Robotics Society began its driverless-car activities in 2005 and organized the Thailand Intelligent Vehicle Challenge during 2007-2009. The competition became more challenging with a bicycle platform in the BicyRobo Thailand Championship, organized during 2010-2012.

This talk will present the key devices and control algorithms behind driverless car technologies. Technologies used by the Google car and by Tesla will first be presented and compared. In the latter part of the talk, the control algorithms used for speed control, heading control, waypoint tracking, and obstacle avoidance in the driverless car, unmanned bicycle, and autonomous forklift developed at the Asian Institute of Technology (AIT) will be presented.
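For readers unfamiliar with waypoint tracking, the short sketch below illustrates one common textbook approach, a pure-pursuit steering law for a bicycle-model vehicle. It is only an illustrative example under assumed parameters (e.g. the wheelbase value) and is not necessarily the algorithm used in the AIT vehicles discussed in the talk.

    import math

    def pure_pursuit_steering(pose, waypoint, wheelbase=2.5):
        """Steering angle that drives a bicycle-model vehicle toward a waypoint.

        pose: (x, y, heading) in the world frame, heading in radians.
        waypoint: (x, y) target point in the world frame.
        wheelbase: front-to-rear axle distance in metres (assumed value).
        """
        x, y, heading = pose
        dx, dy = waypoint[0] - x, waypoint[1] - y
        # Express the waypoint in the vehicle frame (rotate by -heading).
        local_x = math.cos(heading) * dx + math.sin(heading) * dy
        local_y = -math.sin(heading) * dx + math.cos(heading) * dy
        ld = math.hypot(local_x, local_y)        # look-ahead distance
        if ld < 1e-6:
            return 0.0                           # already at the waypoint
        # Pure-pursuit law: curvature = 2 * lateral offset / look-ahead distance^2.
        curvature = 2.0 * local_y / (ld ** 2)
        return math.atan(wheelbase * curvature)  # steering angle in radians

    # Example: vehicle at the origin facing +x, waypoint ahead and to the left.
    print(pure_pursuit_steering((0.0, 0.0, 0.0), (10.0, 2.0)))

In a real vehicle, a steering law of this kind would run in a loop alongside the speed controller and an obstacle-avoidance layer of the sort the talk describes.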

Bio

Manukid Parnichkun is currently a professor in the Mechatronics program at the Asian Institute of Technology. He received his B.Eng. in Mechanical Engineering from Chulalongkorn University in 1991 and his M.Eng. and Ph.D. in Precision Machinery Engineering from the University of Tokyo in 1993 and 1996, respectively. He joined the Asian Institute of Technology as an assistant professor in 1996 and was promoted to associate professor in 2001 and to professor in 2016. He has supervised and graduated 23 doctoral students and 194 master's students. He was a founding committee member of the Thai Robotics Society (TRS) and later became editor-in-chief of the society's journal. He was elected president of the Thai Robotics Society for 2003-2005. He has organized and chaired several conferences and robot competitions. His research interests are mechatronics, robotics, control, and measurement.