Throwback to DevTalks Romania @ Sethu Vijayakumar - From Automation to Autonomy: AI driving the Future of Interactive Robotics

Sethu Vijayakumar, Royal Academy of Engineering - Microsoft Research Chair in Robotics, Director of the Edinburgh Centre for Robotics, Fellow of the Royal Society of Edinburgh, and Program Co-Director at The Alan Turing Institute, London, UK, was our guest speaker at DevTalks Romania 2022, where he spoke on the Main Stage about a timely subject: "From Automation to Autonomy: AI driving the Future of Interactive Robotics".


Here is a sneak peek from his session below, but you can also watch the entire speech right here.

Robotics, perception, and compliance.

Now, there's another big challenge in terms of how we represent things in the world. Classically, we would represent things using Euclidean coordinates: X, Y, Z, go from position (x1, y1, z1) to (x2, y2, z2). But that is not how we normally talk about tasks; I would say pick this up from the top of the table and put it underneath that. So is there a way we can exploit concepts like topology, and a mix of symbolic and continuous reasoning, to do something clever with representation? That is what some of our efforts in representation have been working on. For example, if a robot wants to reach into a box, it can do so in many ways: you can specify the position X, Y, Z and it will reach into it.

But if you use concepts from topology to do the motion planning, the advantage is that when an object or your environment changes or moves, the concept of what is inside and outside the box moves with it. Because of that, you can do efficient re-planning and efficient adaptation of your plans without having to spend a huge amount of compute power to adjust, and this comes from representing your world in a way that is relational.
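To make that relational idea concrete, here is a minimal Python sketch (all names and values are illustrative assumptions, not from the talk): the reach goal is defined relative to the box rather than as fixed world coordinates, so when the box moves, both the goal and the inside/outside predicate move with it, and re-planning reduces to re-evaluating the relation.

```python
import numpy as np

def make_pose(rotation, translation):
    """Homogeneous 4x4 transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def is_inside(box_pose, box_half_extents, point_world):
    """Relational predicate: is a world-frame point inside the box?"""
    p = np.linalg.inv(box_pose) @ np.append(point_world, 1.0)  # into box frame
    return bool(np.all(np.abs(p[:3]) <= box_half_extents))

def goal_in_world(box_pose, goal_in_box_frame):
    """A 'reach inside the box' goal expressed relative to the box, mapped to world."""
    return (box_pose @ np.append(goal_in_box_frame, 1.0))[:3]

# The box sits on the table; the goal is "5 cm below the box opening".
box_pose = make_pose(np.eye(3), np.array([0.5, 0.0, 0.4]))
half_extents = np.array([0.15, 0.10, 0.10])
goal_rel = np.array([0.0, 0.0, -0.05])

print(goal_in_world(box_pose, goal_rel))          # goal before the box moves
box_pose[:3, 3] += np.array([0.2, 0.1, 0.0])      # box is pushed across the table
print(goal_in_world(box_pose, goal_rel))          # goal follows automatically
print(is_inside(box_pose, half_extents, goal_in_world(box_pose, goal_rel)))  # True
```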

Humanoid robots and their capabilities.

So, there's a lot of research on humanoids walking on flat terrain. But getting real-world sensing to interact with a complex system like this, walking on uneven terrain and doing motion planning on the fly, is hard. It's still a big challenge. And again, you might ask why humanoids at all, rather than simpler systems. There are two reasons. One is that we want to challenge the real-time nature of some of our control algorithms and the systems we're building, combining perception and control in one complete package that addresses some of these challenges. There are no external cameras and no external compute power; everything sits inside that one small package.
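As a rough illustration of that "everything in one small package" constraint, here is a hedged Python sketch (rates and function bodies are assumed placeholders, not the real system): perception, planning, and whole-body control all have to fit inside one fixed-rate loop on the robot's onboard computer, with no off-board compute to fall back on.

```python
import time

CONTROL_RATE_HZ = 500         # assumed inner-loop rate for balance/torque control
PLAN_EVERY_N_TICKS = 50       # re-plan at roughly 10 Hz on the same onboard budget

def read_sensors():           # placeholder: IMU, joint encoders, onboard cameras
    return {}

def update_plan(state):       # placeholder: footstep / whole-body re-planning
    return {}

def compute_torques(state, plan):   # placeholder: whole-body controller output
    return {}

def control_loop(n_ticks=1000):
    dt = 1.0 / CONTROL_RATE_HZ
    plan = {}
    for tick in range(n_ticks):
        t0 = time.perf_counter()
        state = read_sensors()
        if tick % PLAN_EVERY_N_TICKS == 0:        # slow loop: planning on the fly
            plan = update_plan(state)
        torques = compute_torques(state, plan)    # fast loop: must always finish
        # ...send torques to the actuators here...
        elapsed = time.perf_counter() - t0
        if elapsed > dt:
            print(f"missed deadline by {(elapsed - dt) * 1e3:.2f} ms")
        else:
            time.sleep(dt - elapsed)

control_loop(n_ticks=500)     # roughly one second of simulated control
```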

But also, some of the offshoots of the technologies we're developing will help us address challenges in self-driving cars and autonomous drones; that is the second reason. Stepping back a moment: what does it really take to do some of these things? What are the scientific challenges? If you look through the eyes of a robot, what it sees is a bunch of textures, some point-cloud information, some 3D LiDAR data. The challenge is going from such raw perception to something where you can track objects in real time.
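As a simplified illustration of that perception-to-tracking step (an assumed pipeline, not the speaker's actual system), the Python sketch below downsamples a raw point cloud, segments a crude cluster around a seed point, and follows it across frames by nearest centroid.

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """Keep one point per occupied voxel to cut raw LiDAR data to a real-time size."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[idx]

def cluster_around(points, seed, radius=0.3):
    """Grab every point within `radius` of a seed point: one crude object segment."""
    mask = np.linalg.norm(points - seed, axis=1) < radius
    return points[mask]

def track(prev_centroid, cloud):
    """Associate the previous centroid with the nearest cluster in the new frame."""
    pts = voxel_downsample(cloud)
    seg = cluster_around(pts, prev_centroid)
    return seg.mean(axis=0) if len(seg) else prev_centroid

# Two synthetic "scans": the object drifts slightly between frames.
rng = np.random.default_rng(0)
frame_1 = rng.normal(loc=[1.0, 0.0, 0.5], scale=0.05, size=(500, 3))
frame_2 = rng.normal(loc=[1.1, 0.05, 0.5], scale=0.05, size=(500, 3))

centroid = frame_1.mean(axis=0)
centroid = track(centroid, frame_2)
print(centroid)   # follows the object to roughly [1.1, 0.05, 0.5]
```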

For more tech content, don’t forget to save your seat for DevTalks Romania 2024 and get your tickets here: https://www.devtalks.ro/tickets.
