
Smart speakers at crime scenes could provide valuable clues to police

Information on faces recognised, voice commands and internet searches can be extracted from an Amazon Echo smart assistant without help from the user or manufacturer





A riveting exploration of how AI models like ChatGPT changed the world

Supremacy, a new book from tech journalist Parmy Olson, takes us inside the rise of machine learning and AI, and examines the people behind it





The deepfakes of Trump and Biden that you are most likely to fall for

Experiments show that viewers can usually identify video deepfakes of famous politicians – but fake audio and text are harder to detect





Cold war spy satellites and AI detect ancient underground aqueducts

Archaeologists are using AI and US spy satellite imagery from the cold war to find ancient underground aqueducts that helped humans survive in the desert





Tiny nuclear-powered battery could work for decades in space or at sea

A new design for a nuclear battery that generates electricity from the radioactive decay of americium is unprecedentedly efficient





AI tweaks to photos and videos can alter our memories

It has become trivially easy to use artificial intelligence to edit images or generate video to remove unwanted objects or beautify scenes, but doing so leads to people misremembering what they have seen





Forcing people to change their passwords is officially a bad idea

A US standards agency has issued new guidance saying organisations shouldn’t require users to change their passwords periodically – advice that is backed up by decades of research





Google says its AI designs chips better than humans – experts disagree

Google DeepMind claims its AlphaChip AI method can deliver “superhuman” chip designs that are already used in its data centres – but independent experts say public proof is lacking





Will semiconductor production be derailed by Hurricane Helene?

Hurricane Helene hit a quartz mine in North Carolina that is key to global semiconductor production, which could impact the entire tech industry. Here is everything we know so far





Bill Gates's Netflix series offers some dubious ideas about the future

In What's Next?, Bill Gates digs into AI, climate, inequality, malaria and more. But the man looms too large for alternative solutions to emerge, says Bethan Ackerley





Hackers can turn your smartphone into an eavesdropping device

Motion sensors in smartphones can be turned into makeshift microphones to eavesdrop on conversations, outsmarting security features designed to stop such attacks





Millions of websites could be impacted by UK deal on Chagos Islands

The UK government's decision to return the Chagos Islands to Mauritius surprisingly threatens the extinction of millions of website addresses ending in ".io", and no one is quite sure what will happen next





How 'quantum software developer' became a job that actually exists

While quantum computers are still in their infancy, more and more people are training to become quantum software developers





Google tool makes AI-generated writing easily detectable

Google DeepMind has been using its AI watermarking method on Gemini chatbot responses for months – and now it’s making the tool available to any AI developer





Musical AI harmonises with your voice in a transcendent new exhibition

What happens if AI is trained to write choral music by feeding it a specially created vocal dataset? Moving new exhibition The Call tackles some thorny questions about AI and creativity – and stirs the soul with music





Battery-like device made from water and clay could be used on Mars

A new supercapacitor design that uses only water, clay and graphene could be built from materials sourced on Mars and be more sustainable and accessible than traditional batteries





Tiny battery made from silk hydrogel can run a mouse pacemaker

A lithium-ion battery made from three droplets of hydrogel is the smallest soft battery of its kind – and it could be used in biocompatible and biodegradable implants





AI models fall for the same scams that we do

Large language models can be used to scam humans, but AI is also susceptible to being scammed – and some models are more gullible than others





AI helps driverless cars predict how unseen pedestrians may move

A specialised algorithm could help autonomous vehicles track hidden objects, such as a pedestrian, a bicycle or another vehicle concealed behind a parked car





How a ride in a friendly Waymo saw me fall for robotaxis

I have a confession to make. After taking a handful of autonomous taxi rides, I have gone from a hater to a friend of robot cars in just a few weeks, says Annalee Newitz





Bali chaos as Tiger row deepens

Tigerair has again cancelled flights in and out of Denpasar, and there is no sign of the dispute being resolved.





Deep in space, a flicker of life

Scientists have found a new building block of life deep in the cold darkness of interstellar space.





Below Deck Sailing Yacht Recap: To Plate or Not to Plate

Gary is up to his usual schtick with Dani. Will he or the new stews ever learn? (Don’t answer that.)





Warn Your Children: Deadpool & Wolverine Is Now on Disney+

Just in time to watch with the whole family over Thanksgiving.





Amazon Prime Video Lets Freevee Go

Don’t worry, you’ll still be able to watch Jury Duty for freevee.





American Sports Story: Aaron Hernandez Finale Recap: Absolute Freedom

The finale doesn’t look to provide a definitive answer to what drove Aaron’s actions, much to the show’s credit.





Gary Lineker replacement decided as BBC tipped for rogue MOTD appointment



Express Sport writers have decided who should replace Gary Lineker





Howard Webb breaks silence on leaked David Coote Liverpool video as ref suspended



PGMOL chief Howard Webb has responded after referee David Coote was suspended for comments he appeared to make in a video.





Video Friday: Disney Robot Dance



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

I think it’s time for us all to admit that some of the most interesting bipedal and humanoid research is being done by Disney.

[ Research Paper from ETH Zurich and Disney Research ]

Over the past few months, the Unitree G1 robot has been upgraded into a mass-production version, with stronger performance, a more refined appearance, and a design better suited to mass-production requirements.

[ Unitree ]

This robot is from Kinisi Robotics, which was founded by Brennand Pierce, who also founded Bear Robotics. You can’t really tell from this video, but check out the website because the reach this robot has is bonkers.

Kinisi Robotics is on a mission to democratize access to advanced robotics with our latest innovation—a low-cost, dual-arm robot designed for warehouses, factories, and supermarkets. What sets our robot apart is its integration of LLM technology, enabling it to learn from demonstrations and perform complex tasks with minimal setup. Leveraging Brennand’s extensive experience in scaling robotic solutions, we’re able to produce this robot for under $20k, making it a game-changer in the industry.

[ Kinisi Robotics ]

Thanks Bren!

Finally, something that Atlas does that I am also physically capable of doing. In theory.

Okay, never mind. I don’t have those hips.

[ Boston Dynamics ]

Researchers in the Department of Mechanical Engineering at Carnegie Mellon University have created the first legged robot of its size to run, turn, push loads, and climb miniature stairs.

They say it can “run,” but I’m skeptical that there’s a flight phase unless someone sneezes nearby.

[ Carnegie Mellon University ]

The lights are cool and all, but it’s the pulsing soft skin that’s squigging me out.

[ Paper, Robotics Reports Vol.2 ]

Roofing is a difficult and dangerous enough job that it would be great if robots could take it over. It’ll be a challenge though.

[ Renovate Robotics ] via [ TechCrunch ]

Kento Kawaharazuka from JSK Robotics Laboratory at the University of Tokyo wrote in to share this paper, just accepted at RA-L, which (among other things) shows a robot using its flexible hands to identify objects through random finger motion.

[ Paper accepted by IEEE Robotics and Automation Letters ]

Thanks Kento!

It’s one thing to make robots that are reliable, and it’s another to make robots that are reliable and repairable by the end user. I don’t think iRobot gets enough credit for this.

[ iRobot ]

I like competitions where they say, “just relax and forget about the competition and show us what you can do.”

[ MBZIRC Maritime Grand Challenge ]

I kid you not, this used to be my job.

[ RoboHike ]





Video Friday: Robots Solving Table Tennis




Imbuing robots with “human-level performance” in anything is an enormous challenge, but it’s worth it when you see a robot with the skill to interact with a human on a (nearly) human level. Google DeepMind has managed to achieve amateur human-level competence at table tennis, which is much harder than it looks, even for humans. Pannag Sanketi, a tech-lead manager in the robotics team at DeepMind, shared some interesting insights about performing the research. But first, video!

Some behind the scenes detail from Pannag:

  • The robot had not seen any of the participants before. We knew we had a cool agent, but we had no idea how it would fare in a full match with real humans. To witness it outmaneuver even some of the most advanced players was such a delightful moment for the team!
  • All the participants had a lot of fun playing against the robot, irrespective of who won the match, and all of them wanted to play more. Some said it would be great to have the robot as a playing partner. From the videos, you can even see how much fun the user-study hosts sitting there (who are not authors on the paper) are having watching the games!
  • Barney, a professional coach, was an advisor on the project and our chief evaluator of the robot’s skills, assessing it the way he evaluates his students. He was also surprised by how the robot kept learning from the previous few weeks’ sessions.
  • We invested a lot in remote and automated 24x7 operations. Not with the setup in this video, but we have other cells that can run 24x7 with a ball thrower.
  • We even tried robot-vs-robot, i.e., two robots playing against each other! :) The line between collaboration and competition becomes very interesting when they try to learn by playing with each other.

[ DeepMind ]

Thanks, Heni!

Yoink.

[ MIT ]

Considering how their stability and recovery is often tested, teaching robot dogs to be shy of humans is an excellent idea.

[ Deep Robotics ]

Yes, quadruped robots need tow truck hooks.

[ Paper ]

Earthworm-inspired robots require novel actuators, and Ayato Kanada at Kyushu University has come up with a neat one.

[ Paper ]

Thanks, Ayato!

Meet the AstroAnt! This miniaturized swarm robot can ride atop a lunar rover and collect data related to its health, including surface temperatures and damage from micrometeoroid impacts. In the summer of 2024, with support from our collaborator Castrol, the Media Lab’s Space Exploration Initiative tested AstroAnt in the Canary Islands, where the volcanic landscape resembles the lunar surface.

[ MIT ]

Kengoro has a new forearm that mimics the human radioulnar joint giving it an even more natural badminton swing.

[ JSK Lab ]

Thanks, Kento!

Gromit’s concern that Wallace is becoming too dependent on his inventions proves justified, when Wallace invents a “smart” gnome that seems to develop a mind of its own. When it emerges that a vengeful figure from the past might be masterminding things, it falls to Gromit to battle sinister forces and save his master… or Wallace may never be able to invent again!

[ Wallace and Gromit ]

ASTORINO is a modern 6-axis robot based on 3D printing technology. Programmable in AS-language, it facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.

[ Kawasaki ]

Engineers at NASA’s Jet Propulsion Laboratory are testing a prototype of IceNode, a robot designed to access one of the most difficult-to-reach places on Earth. The team envisions a fleet of these autonomous robots deploying into unmapped underwater cavities beneath Antarctic ice shelves. There, they’d measure how fast the ice is melting — data that’s crucial to helping scientists accurately project how much global sea levels will rise.

[ IceNode ]

Los Alamos National Laboratory, in a consortium with four other National Laboratories, is leading the charge in finding the best practices to find orphaned wells. These abandoned wells can leak methane gas into the atmosphere and possibly leak liquid into the ground water.

[ LANL ]

Looks like Fourier has been working on something new, although this is still at the point of “looks like” rather than something real.

[ Fourier ]

Bio-Inspired Robot Hands: Altus Dexterity is a collaboration between researchers and professionals from Carnegie Mellon University, UPMC, the University of Illinois and the University of Houston.

[ Altus Dexterity ]

PiPER is a lightweight robotic arm with six integrated joint motors for smooth, precise control. Weighing just 4.2kg, it easily handles a 1.5kg payload and is made from durable yet lightweight materials for versatile use across various environments. Available for just $2,499 USD.

[ AgileX ]

At 104 years old, Lilabel has seen over a century of automotive transformation, from sharing a single car with her family in the 1920s to experiencing her first ride in a robotaxi.

[ Zoox ]

Traditionally, blind juggling robots use plates that are slightly concave to help them with ball control, but it’s also possible to make a blind juggler the hard way. Which, honestly, is much more impressive.

[ Jugglebot ]





Unitree Demos New $16k Robot


At ICRA 2024, Spectrum editor Evan Ackerman sat down with Unitree founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the G1 model.

Smaller, more flexible, and elegant, the G1 robot is designed for general use in service and industry, and is one of the cheapest—if not the cheapest—of a new wave of advanced AI humanoid robots.





Video Friday: HAND to Take on Robotic Hands




The National Science Foundation Human AugmentatioN via Dexterity Engineering Research Center (HAND ERC) was announced in August 2024. Funded for up to 10 years and $52 million, the HAND ERC is led by Northwestern University, with core members Texas A&M, Florida A&M, Carnegie Mellon, and MIT, and support from Wisconsin-Madison, Syracuse, and an innovation ecosystem consisting of companies, national labs, and civic and advocacy organizations. HAND will develop versatile, easy-to-use dexterous robot end effectors (hands).

[ HAND ]

The Environmental Robotics Lab at ETH Zurich, in partnership with Wilderness International (and some help from DJI and Audi), is using drones to sample DNA from the tops of trees in the Peruvian rainforest. Somehow, the treetops are where 60 to 90 percent of biodiversity is found, and these drones can help researchers determine what the heck is going on up there.

[ ERL ]

Thanks, Steffen!

1X introduces NEO Beta, “the pre-production build of our home humanoid.”

“Our priority is safety,” said Bernt Børnich, CEO at 1X. “Safety is the cornerstone that allows us to confidently introduce NEO Beta into homes, where it will gather essential feedback and demonstrate its capabilities in real-world settings. This year, we are deploying a limited number of NEO units in selected homes for research and development purposes. Doing so means we are taking another step toward achieving our mission.”

[ 1X ]

We love MangDang’s fun and affordable approach to robotics with Mini Pupper. The next generation of the little legged robot has just launched on Kickstarter, featuring new and updated robots that make it easy to explore embodied AI.

The Kickstarter is already fully funded after just a day or two, but there are still plenty of robots up for grabs.

[ Kickstarter ]

Quadrupeds in space can use their legs to reorient themselves. Or, if you throw one off a roof, it can learn to land on its feet.

To be presented at CoRL 2024.

[ ARL ]

HEBI Robotics, which apparently was once headquartered inside a Pittsburgh public bus, has imbued a table with actuators and a mind of its own.

[ HEBI Robotics ]

Carcinization is a concept in evolutionary biology where a crustacean that isn’t a crab eventually becomes a crab. So why not do the same thing with robots? Crab robots solve all problems!

[ KAIST ]

Waymo is smart, but also humans are really, really dumb sometimes.

[ Waymo ]

The Robotics Department of the University of Michigan created an interactive community art project. The group that led the creation believed that while roboticists typically take on critical and impactful problems in transportation, medicine, mobility, logistics, and manufacturing, there are many opportunities to find play and amusement. The final piece is a grid of art boxes, produced by different members of our robotics community, which offer an eight-inch-square view into their own work with robotics.

[ Michigan Robotics ]

I appreciate that UBTECH’s humanoid is doing an actual job, but why would you use a humanoid for this?

[ UBTECH ]

I’m sure most actuators go through some form of life-cycle testing. But if you really want to test an electric motor, put it into a BattleBot and see what happens.

[ Hardcore Robotics ]

Yes, but have you tried fighting a BattleBot?

[ AgileX ]

In this video, we present collaborative aerial grasping and transportation using multiple quadrotors with cable-suspended payloads. Grasping with a suspended gripper requires accurate tracking of the electromagnet to ensure a successful grasp while switching between slack and taut cable modes. In this work, we grasp the payload using a hybrid control approach that switches between quadrotor position control and payload position control based on cable slackness. Finally, we use two quadrotors with suspended electromagnet systems to collaboratively grasp and pick up a larger payload for transportation.
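The switching rule described above can be sketched as a simple threshold on cable slack. This is an illustrative sketch only, not the authors' code; the function name, tolerance value, and mode labels are all assumptions.

```python
def pick_control_mode(quad_pos, payload_pos, cable_length, taut_tol=0.02):
    """Choose which controller runs, based on cable slackness.

    When the quadrotor-to-payload distance approaches the cable
    length, the cable is taut and payload position control takes
    over; otherwise the quadrotor is position-controlled directly.
    (Hypothetical sketch; the tolerance is an assumed value.)
    """
    distance = sum((a - b) ** 2 for a, b in zip(quad_pos, payload_pos)) ** 0.5
    slack = cable_length - distance
    return "payload_position" if slack <= taut_tol else "quadrotor_position"

taut = pick_control_mode((0, 0, 1.0), (0, 0, 0.0), cable_length=1.0)
slack = pick_control_mode((0, 0, 0.5), (0, 0, 0.0), cable_length=1.0)
```

In a real system the same threshold test would run inside the flight control loop, with the distance coming from state estimation rather than ground-truth positions.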

[ Hybrid Robotics ]

I had not realized that the floretizing of broccoli was so violent.

[ Oxipital ]

While the RoboCup was held over a month ago, we still wanted to make a small summary of our results, the most memorable moments, and of course an homage to everyone who is involved with the B-Human team: the team members, the sponsors, and the fans at home. Thank you so much for making B-Human the team it is!

[ B-Human ]





Video Friday: Jumping Robot Leg, Walking Robot Table




Researchers at the Max Planck Institute for Intelligent Systems and ETH Zurich have developed a robotic leg with artificial muscles. Inspired by living creatures, it jumps across different terrains in an agile and energy-efficient manner.

[ Nature ] via [ MPI ]

Thanks, Toshi!

ETH Zurich researchers have now developed a fast robotic printing process for earth-based materials that does not require cement. In what is known as “impact printing,” a robot shoots material from above, gradually building a wall. On impact, the parts bond together, and very minimal additives are required.

[ ETH Zurich ]

How could you not be excited to see this happen for real?

[ arXiv paper ]

Can we all agree that sanding, grinding, deburring, and polishing tasks are really best done by robots, for the most part?

[ Cohesive Robotics ]

Thanks, David!

Using doors is a longstanding challenge in robotics and is of significant practical interest in giving robots greater access to human-centric spaces. The task is challenging due to the need for online adaptation to varying door properties and precise control in manipulating the door panel and navigating through the confined doorway. To address this, we propose a learning-based controller for a legged manipulator to open and traverse through doors.

[ arXiv paper ]

Isaac is the first robot assistant that’s built for the home. And we’re shipping it in fall of 2025.

Fall of 2025 is a long enough time from now that I’m not even going to speculate about it.

[ Weave Robotics ]

By patterning liquid metal paste onto a soft sheet of silicone or acrylic foam tape, we developed stretchable versions of conventional rigid circuits (like Arduinos). Our soft circuits can be stretched to over 300% strain (over 4x their length) and are integrated into active soft robots.

[ Science Robotics ] via [ Yale ]

NASA’s Curiosity rover is exploring a scientifically exciting area on Mars, but communicating with the mission team on Earth has recently been a challenge due to both the current season and the surrounding terrain. In this Mars Report, Curiosity engineer Reidar Larsen takes you inside the uplink room where the team talks to the rover.

[ NASA ]

I love this and want to burn it with fire.

[ Carpentopod ]

Very often, people ask us what Reachy 2 is capable of, which is why we’re showing you the manipulation possibilities (through teleoperation) of our technology. The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!

[ Pollen Robotics ]

The Scalable Autonomous Robots (ScalAR) Lab is an interdisciplinary lab focused on fundamental research problems in robotics that lie at the intersection of robotics, nonlinear dynamical systems theory, and uncertainty.

[ ScalAR Lab ]

Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of #KawasakiRobotics industrial robots, giving students the necessary skills for future work.

[ Astorino ]

I guess fish-fillet-shaping robots need to exist because otherwise customers will freak out if all their fish fillets are not identical, or something?

[ Flexiv ]

Watch the second episode of the ExoMars Rosalind Franklin rover mission—Europe’s ambitious exploration journey to search for past and present signs of life on Mars. The rover will dig, collect, and investigate the chemical composition of material collected by a drill. Rosalind Franklin will be the first rover to reach a depth of up to two meters below the surface, acquiring samples that have been protected from surface radiation and extreme temperatures.

[ ESA ]





One AI Model to Rule All Robots



The software used to control a robot is normally highly adapted to its specific physical set up. But now researchers have created a single general-purpose robotic control policy that can operate robotic arms, wheeled robots, quadrupeds, and even drones.

One of the biggest challenges when it comes to applying machine learning to robotics is the paucity of data. While computer vision and natural language processing can piggyback off the vast quantities of image and text data found on the Internet, collecting robot data is costly and time-consuming.

To get around this, there have been growing efforts to pool data collected by different groups on different kinds of robots, including the Open X-Embodiment and DROID datasets. The hope is that training on diverse robotics data will lead to “positive transfer,” which refers to when skills learned from training on one task help to boost performance on another.

The problem is that robots often have very different embodiments—a term used to describe their physical layout and suite of sensors and actuators—so the data they collect can vary significantly. For instance, a robotic arm might be static, have a complex arrangement of joints and fingers, and collect video from a camera on its wrist. In contrast, a quadruped robot is regularly on the move and relies on force feedback from its legs to maneuver. The kinds of tasks and actions these machines are trained to carry out are also diverse: the arm may pick and place objects, while the quadruped must navigate its surroundings.

That makes training a single AI model for robots on these large collections of data challenging, says Homer Walke, a Ph.D. student at the University of California, Berkeley. So far, most attempts have either focused on data from a narrower selection of similar robots, or researchers have manually tweaked the data to make observations from different robots more similar. But in research to be presented at the Conference on Robot Learning (CoRL) in Munich in November, Walke and his colleagues unveiled a new model called CrossFormer that can train on data from a diverse set of robots and control them just as well as specialized control policies.

“We want to be able to train on all of this data to get the most capable robot,” says Walke. “The main advance in this paper is working out what kind of architecture works the best for accommodating all these varying inputs and outputs.”

How to control diverse robots with the same AI model

The team used the same model architecture that powers large language models, known as a transformer. In many ways, the challenge the researchers were trying to solve is not dissimilar to the one facing a chatbot, says Walke. In language modeling, the AI has to pick out similar patterns in sentences with different lengths and word orders. Robot data can also be arranged in a sequence much like a written sentence, but depending on the particular embodiment, observations and actions vary in length and order too.

“Words might appear in different locations in a sentence, but they still mean the same thing,” says Walke. “In our task, an observation image might appear in different locations in the sequence, but it’s still fundamentally an image and we still want to treat it like an image.”


Most machine learning approaches work through a sequence one element at a time, but transformers can process the entire stream of data at once. This allows them to analyze the relationship between different elements and makes them better at handling sequences that are not standardized, much like the diverse data found in large robotics datasets.
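To make that concrete, here is a minimal sketch of how observations from different embodiments could be flattened into one token sequence for a transformer. This is not the actual CrossFormer code: the modality names, the fixed random projections (a real model would learn these weights), and the dimensions are all illustrative assumptions.

```python
import numpy as np

def tokenize_episode(observations, token_dim=64, rng=None):
    """Flatten a robot's heterogeneous observations into one token
    sequence, as a cross-embodiment transformer might.

    `observations` is a list of (kind, array) pairs, e.g.
    ("image", HxWx3) or ("joints", N). Each modality is projected
    into a shared token space, so the transformer sees a uniform
    sequence of vectors regardless of the embodiment.
    """
    rng = rng or np.random.default_rng(0)
    projections = {}
    tokens = []
    for kind, arr in observations:
        flat = np.asarray(arr, dtype=float).ravel()
        # One fixed random projection per (modality, size); a real
        # model would learn a separate encoder for each modality.
        key = (kind, flat.size)
        if key not in projections:
            projections[key] = rng.standard_normal(
                (flat.size, token_dim)) / np.sqrt(flat.size)
        tokens.append(flat @ projections[key])
    return np.stack(tokens)  # shape: (num_tokens, token_dim)

# A wrist-camera frame plus joint angles from an arm...
arm_seq = tokenize_episode([
    ("image", np.zeros((8, 8, 3))),
    ("joints", np.zeros(7)),
])
# ...and leg-joint data from a quadruped, in the same token space.
quad_seq = tokenize_episode([("joints", np.zeros(12))])
```

The point of the shared token space is that the downstream transformer never needs to know which embodiment produced a given token, only where it sits in the sequence.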

Walke and his colleagues aren’t the first to train transformers on large-scale robotics data. But previous approaches have either trained solely on data from robotic arms with broadly similar embodiments or manually converted input data to a common format to make it easier to process. In contrast, CrossFormer can process images from cameras positioned above a robot, at head height, or on a robotic arm’s wrist, as well as joint-position data from both quadrupeds and robotic arms, without any tweaks.

The result is a single control policy that can operate single robotic arms, pairs of robotic arms, quadrupeds, and wheeled robots on tasks as varied as picking and placing objects, cutting sushi, and obstacle avoidance. Crucially, it matched the performance of specialized models tailored for each robot and outperformed previous approaches trained on diverse robotic data. The team even tested whether the model could control an embodiment not included in the dataset—a small quadcopter. While they simplified things by making the drone fly at a fixed altitude, CrossFormer still outperformed the previous best method.

“That was definitely pretty cool,” says Ria Doshi, an undergraduate student at Berkeley. “I think that as we scale up our policy to be able to train on even larger sets of diverse data, it’ll become easier to see this kind of zero shot transfer onto robots that have been completely unseen in the training.”

The limitations of one AI model for all robots

The team admits there’s still work to do, however. The model is too big for any of the robots’ embedded chips and instead has to be run from a server. Even then, processing times are only just fast enough to support real-time operation, and Walke admits that could break down if they scale up the model. “When you pack so much data into a model it has to be very big and that means running it for real-time control becomes difficult.”

One potential workaround would be to use an approach called distillation, says Oier Mees, a postdoctoral researcher at Berkeley and part of the CrossFormer team. This essentially involves training a smaller model to mimic the larger model, and if successful it can deliver similar performance on a much smaller computational budget.
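In spirit, distillation looks something like the following toy sketch, where a small linear "student" is fit to the outputs of a frozen "teacher". This is an assumption-laden illustration, not the team's method: real robot policies are deep networks, and the student would be a smaller network rather than a single weight matrix.

```python
import numpy as np

def distill(teacher, inputs, out_dim, lr=0.1, epochs=200):
    """Train a small linear 'student' to mimic a larger 'teacher'
    policy by regressing on the teacher's outputs (toy example).
    """
    rng = np.random.default_rng(0)
    W = rng.standard_normal((inputs.shape[1], out_dim)) * 0.01
    targets = teacher(inputs)  # the teacher's actions act as labels
    for _ in range(epochs):
        preds = inputs @ W
        # Gradient descent on mean-squared error to the teacher.
        W -= lr * inputs.T @ (preds - targets) / len(inputs)
    return W

# Hypothetical 'teacher': a fixed linear map from 16-D observations
# to 4-D actions, queried only through its outputs.
T = np.random.default_rng(1).standard_normal((16, 4))
X = np.random.default_rng(2).standard_normal((256, 16))
student = distill(lambda x: x @ T, X, out_dim=4)
err = np.abs(X @ student - X @ T).mean()  # student tracks the teacher
```

The teacher is treated as a black box here, which is the appeal of distillation: only its input-output behavior is needed, not its weights or architecture.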

But more important than the computing-resource problem is that the team failed to see any positive transfer in their experiments: CrossFormer merely matched the performance of previous approaches rather than exceeding it. Walke thinks progress in computer vision and natural language processing suggests that training on more data could be the key.

Others say it might not be that simple. Jeannette Bohg, a professor of robotics at Stanford University, says the ability to train on such a diverse dataset is a significant contribution. But she wonders whether part of the reason why the researchers didn’t see positive transfer is their insistence on not aligning the input data. Previous research that trained on robots with similar observation and action data has shown evidence of such cross-overs. “By getting rid of this alignment, they may have also gotten rid of this significant positive transfer that we’ve seen in other work,” Bohg says.

It’s also not clear if the approach will boost performance on tasks specific to particular embodiments or robotic applications, says Ram Ramamoorthy, a robotics professor at Edinburgh University. The work is a promising step towards helping robots capture concepts common to most robots, like “avoid this obstacle,” he says. But it may be less useful for tackling control problems specific to a particular robot, such as how to knead dough or navigate a forest, which are often the hardest to solve.





Video Friday: Zipline Delivers




Zipline has (finally) posted some real live footage of its new Platform 2 drone, and while it’s just as weird looking as before, it seems to actually work really well.

[ Zipline ]

I appreciate Disney Research’s insistence on always eventually asking, “okay, but can we get this to work on a real robot in the real world?”

[ Paper from ETH Zurich and Disney Research [PDF] ]

In this video, we showcase our humanoid robot, Nadia, being remotely controlled for boxing training using a simple VR motion capture setup. A remote user takes charge of Nadia’s movements, demonstrating the power of our advanced teleoperation system. Watch as Nadia performs precise boxing moves, highlighting the potential for humanoid robots in dynamic, real-world tasks.

[ IHMC ]

Guide dogs are expensive to train and maintain—if available at all. Because of these limiting factors, relatively few blind people use them. Computer science assistant professor Donghyun Kim and Ph.D. candidate Hochul Hwang are hoping to change that with the help of UMass database analyst Gail Gunn and her guide dog, Brawny.

[ University of Massachusetts, Amherst ]

Thanks Julia!

The current paradigm for motion planning generates solutions from scratch for every new problem, which consumes significant amounts of time and computational resources. Our approach builds a large number of complex scenes in simulation, collects expert data from a motion planner, then distills it into a reactive generalist policy. We then combine this with lightweight optimization to obtain a safe path for real world deployment.

[ Neural MP ]

A nice mix of NAO and AI for embodied teaching.

[ Aldebaran ]

When retail and logistics giant Otto Group set out to strengthen its operational efficiency and safety, it turned to robotics and automation. The Otto Group has become the first company in Europe to deploy the mobile case handling robot Stretch, which unloads floor-loaded trailers and containers.

[ Boston Dynamics ]

From groceries to last-minute treats, Wing is here to make sure deliveries arrive quickly and safely. Our latest aircraft design features a larger, more standardized box and can carry a higher payload, changes that came directly from customer and partner feedback.

[ Wing ]

It’s the jacket that gets me.

[ Devanthro ]

In this video, we introduce Rotograb, a robotic hand that merges the dexterity of human hands with the strength and efficiency of industrial grippers. Rotograb features a new rotating thumb mechanism, allowing for precision in-hand manipulation and power grasps while being adaptable. The robotic hand was developed by students during “Real World Robotics”, a master course by the Soft Robotics Lab at ETH Zurich.

[ ETH Zurich ]

A small scene where Rémi, our distinguished professor, is teaching chess to the person remotely operating Reachy! The grippers allow for easy and precise handling of chess pieces, even the small ones! The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!

[ Pollen ]

Enhancing the adaptability and versatility of unmanned micro aerial vehicles (MAVs) is crucial for expanding their application range. In this article, we present a bimodal reconfigurable robot capable of operating in both regular quadcopter flight mode and a unique revolving flight mode, which allows independent control of the vehicle’s position and roll-pitch attitude.

[ City University Hong Kong ]

The Parallel Continuum Manipulator (PACOMA) is an advanced robotic system designed to replace traditional robotic arms in space missions, such as exploration, in-orbit servicing, and docking. Its design emphasizes robustness against misalignments and impacts, high precision and payload capacity, and sufficient mechanical damping for stable, controlled movements.

[ DFKI Robotics Innovation Center ]

Even the FPV pros from Team BlackSheep do, very occasionally, crash.

[ Team BlackSheep ]

This is a one-hour uninterrupted video of a robot cleaning bathrooms in real time. I’m not sure if it’s practical, but I am sure that it’s impressive, honestly.

[ Somatic ]




de

Detachable Robotic Hand Crawls Around on Finger-Legs



When we think of grasping robots, we think of manipulators of some sort on the ends of arms of some sort. Because of course we do—that’s how (most of us) are built, and that’s the mindset with which we have consequently optimized the world around us. But one of the great things about robots is that they don’t have to be constrained by our constraints, and at ICRA@40 in Rotterdam this week, we saw a novel Thing: a robotic hand that can detach from its arm and then crawl around to grasp objects that would be otherwise out of reach, designed by roboticists from EPFL in Switzerland.

Fundamentally, robot hands and crawling robots share a lot of similarities, including a body along with some wiggly bits that stick out and do stuff. But most robotic hands are designed to grasp rather than crawl, and as far as I’m aware, no robotic hands have been designed to do both of those things at the same time. Since both capabilities are important, you don’t necessarily want to stick with a traditional grasping-focused hand design. The researchers employed a genetic algorithm and simulation to test a bunch of different configurations in order to optimize for the ability to hold things and to move.
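The EPFL team's actual genome encoding and simulator are not detailed here, but the generic loop they describe—evolving candidate configurations against a simulated fitness score—can be sketched with tournament selection, one-point crossover, and Gaussian mutation. Everything below (genome length, the toy fitness, all parameters) is illustrative, not from the paper:

```python
import random

def evolve(fitness, genome_len=6, pop_size=30, generations=40, seed=0):
    """Minimal genetic algorithm over real-valued genomes in [0, 1]:
    tournament selection, one-point crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # size-2 tournament: keep the fitter of two candidates
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.05)))  # mutate
                     for g in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness standing in for the simulated grasp-and-crawl evaluation:
# prefer joint parameters near 0.7 (purely illustrative).
best = evolve(lambda g: -sum((x - 0.7) ** 2 for x in g))
```

In the real system the fitness call would run a physics simulation scoring both grasp quality and crawling speed, which is far more expensive than this toy objective but slots into the same loop.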

You’ll notice that the fingers bend backwards as well as forwards, which effectively doubles the ways in which the hand (or, “Handcrawler”) can grasp objects. And it’s a little bit hard to tell from the video, but the Handcrawler attaches to the wrist using magnets for alignment along with a screw that extends to lock the hand into place.

“Although you see it in scary movies, I think we’re the first to introduce this idea to robotics.” —Xiao Gao, EPFL

The whole system is controlled manually in the video, but lead author Xiao Gao tells us that they already have an autonomous version (with external localization) working in the lab. In fact, they’ve managed to run an entire grasping sequence autonomously, with the Handcrawler detaching from the arm, crawling to a location the arm can’t reach, picking up an object, and then returning and reattaching itself to the arm again.

Beyond Manual Dexterity: Designing a Multi-fingered Robotic Hand for Grasping and Crawling, by Xiao Gao, Kunpeng Yao, Kai Junge, Josie Hughes, and Aude Billard from EPFL and MIT, was presented at ICRA@40 this week in Rotterdam.




de

Video Friday: ICRA Turns 40



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH
Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

The interaction between humans and machines is gaining increasing importance due to the advancing degree of automation. This video showcases the development of robotic systems capable of recognizing and responding to human wishes.

By Jana Jost, Sebastian Hoose, Nils Gramse, Benedikt Pschera, and Jan Emmerich from Fraunhofer IML

[ Fraunhofer IML ]

Humans are capable of continuously manipulating a wide variety of deformable objects into complex shapes, owing largely to our ability to reason about material properties as well as our ability to reason in the presence of geometric occlusion in the object’s state. To study the robotic systems and algorithms capable of deforming volumetric objects, we introduce a novel robotics task of continuously deforming clay on a pottery wheel, and we present a baseline approach for tackling such a task by learning from demonstration.

By Adam Hung, Uksang Yoo, Jonathan Francis, Jean Oh, and Jeffrey Ichnowski from CMU Robotics Institute

[ Carnegie Mellon University Robotics Institute ]

Suction-based robotic grippers are common in industrial applications due to their simplicity and robustness, but [they] struggle with geometric complexity. Grippers that can handle varied surfaces as easily as traditional suction grippers would be more effective. Here we show how a fractal structure allows suction-based grippers to increase conformability and expand approach angle range.

By Patrick O’Brien, Jakub F. Kowalewski, Chad C. Kessens, and Jeffrey Ian Lipton from Northeastern University Transformative Robotics Lab

[ Northeastern University ]

We introduce a newly developed robotic musician designed to play an acoustic guitar in a rich and expressive manner. Unlike previous robotic guitarists, our Expressive Robotic Guitarist (ERG) is designed to play a commercial acoustic guitar while controlling a wide dynamic range, millisecond-level note generation, and a variety of playing techniques such as strumming, picking, overtones, and hammer-ons.

By Ning Yang, Amit Rogel, and Gil Weinberg from Georgia Tech

[ Georgia Tech ]

The iCub project was initiated in 2004 by Giorgio Metta, Giulio Sandini, and David Vernon to create a robotic platform for embodied cognition research. The main goals of the project were to design a humanoid robot, named iCub, to create a community by leveraging open-source licensing, and to implement several basic elements of artificial cognition and developmental robotics. More than 50 iCubs have been built and used worldwide for various research projects.

[ Istituto Italiano di Tecnologia ]

In our video, we present SCALER-B, a multi-modal versatile climbing robot that is a quadruped robot capable of standing up, bipedal locomotion, bipedal climbing, and pullups with two finger grippers.

By Yusuke Tanaka, Alexander Schperberg, Alvin Zhu, and Dennis Hong from UCLA

[ Robotics Mechanical Laboratory at UCLA ]

This video explores Waseda University’s innovative journey in developing wind instrument-playing robots, from automated performance to interactive musical engagement. Through demonstrations of technical advancements and collaborative performances, the video illustrates how Waseda University is pushing the boundaries of robotics, blending technology and artistry to create interactive robotic musicians.

By Jia-Yeu Lin and Atsuo Takanishi from Waseda University

[ Waseda University ]

This video presents a brief history of robot painting projects with the intention of educating viewers about the specific, core robotics challenges that people developing robot painters face. We focus on four robotics challenges: controls, the simulation-to-reality gap, generative intelligence, and human-robot interaction. We show how various projects tackle these challenges with quotes from experts in the field.

By Peter Schaldenbrand, Gerry Chen, Vihaan Misra, Lorie Chen, Ken Goldberg, and Jean Oh from CMU

[ Carnegie Mellon University ]

The wheeled humanoid neoDavid is one of the most complex humanoid robots worldwide. All finger joints can be controlled individually, giving the system exceptional dexterity. neoDavid’s Variable Stiffness Actuators (VSAs) enable very high performance in tasks involving fast collisions, highly energetic vibrations, or explosive motions, such as hammering, using power tools (e.g. a drill hammer), or throwing a ball.

[ DLR Institute of Robotics and Mechatronics ]

This paper introduces LG Electronics’ journey to commercialize robot navigation technology in areas such as homes, public spaces, and factories. It discusses the technical challenges ahead in robot navigation and the innovations needed to improve our lives. With its vision of the “Zero Labor Home”, LG’s next smart home agent robot will bring the next innovation to our lives through advances in spatial AI, i.e. the combination of robot navigation and AI technology.

By Hyoung-Rock Kim, DongKi Noh and Seung-Min Baek from LG

[ LG ]

HILARE stands for: Heuristiques Intégrées aux Logiciels et aux Automatismes dans un Robot Evolutif. The HILARE project started at the end of 1977 at LAAS (then the Laboratoire d’Automatique et d’Analyse des Systèmes) under the leadership of Georges Giralt. The video features the HILARE robot and delivers explanations.

By Aurelie Clodic, Raja Chatila, Marc Vaisset, Matthieu Herrb, Stephy Le Foll, Jerome Lamy, and Simon Lacroix from LAAS/CNRS (Note that the video narration is in French with English subtitles.)

[ LAAS/CNRS ]

Humanoid legged locomotion is versatile, but typically used for reaching nearby targets. Employing a personal transporter (PT) designed for humans, such as a Segway, offers an alternative for humanoids navigating the real world, enabling them to switch from walking to wheeled locomotion for covering larger distances, similar to humans. In this work, we develop control strategies that allow humanoids to operate PTs while maintaining balance.

By Vidyasagar Rajendran, William Thibault, Francisco Javier Andrade Chavez, and Katja Mombaur from University of Waterloo

[ University of Waterloo ]

Motion planning, and in particular in tight settings, is a key problem in robotics and manufacturing. One infamous example of a difficult, tight motion planning problem is the Alpha Puzzle. We present a first demonstration in the real world of an Alpha Puzzle solution with a Universal Robots UR5e, using a solution path generated from our previous work.

By Dror Livnat, Yuval Lavi, Michael M. Bilevich, Tomer Buber, and Dan Halperin from Tel Aviv University

[ Tel Aviv University ]

Interaction between humans and their environment has been a key factor in the evolution and the expansion of intelligent species. Here we present methods to design and build an artificial environment through interactive robotic surfaces.

By Fabio Zuliani, Neil Chennoufi, Alihan Bakir, Francesco Bruno, and Jamie Paik from EPFL

[ EPFL Reconfigurable Robotics Lab ]

At the intersection of swarm robotics and architecture, we created the Swarm Garden, a novel responsive system to be deployed on façades. The Swarm Garden is an adaptive shading system made of a swarm of robotic modules that respond to humans and the environment while creating beautiful spaces. In this video, we showcase 35 robotic modules that we designed and built for The Swarm Garden.

By Merihan Alhafnawi, Lucia Stein-Montalvo, Jad Bendarkawi, Yenet Tafesse, Vicky Chow, Sigrid Adriaenssens, and Radhika Nagpal from Princeton University

[ Princeton University ]

My team at the University of Southern Denmark has been pioneering the field of self-recharging drones since 2017. These drones are equipped with a robust perception and navigation system, enabling them to identify powerlines and approach them for landing. A unique feature of our drones is their self-recharging capability. They accomplish this by landing on powerlines and utilizing a passively actuated gripping mechanism to secure themselves to the powerline cable.

By Emad Ebeid from the University of Southern Denmark

[ University of Southern Denmark (SDU) ]

This paper explores the design and implementation of Furnituroids, shape-changing mobile furniture robots that embrace ambiguity to offer multiple and dynamic affordances for both individual and social behaviors.

By Yasuto Nakanishi from Keio University

[ Keio University ]




de

Video Friday: Quadruped Ladder Climbing



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH
Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

Not even ladders can keep you safe from quadruped robots anymore.

[ ETH Zürich Robot Systems Lab ]

Introducing Azi (right), the new desktop robot from Engineered Arts Ltd. Azi and Ameca are having a little chat, demonstrating their wide range of expressive capabilities. Engineered Arts desktop robots feature 32 actuators, 27 for facial control alone, and 5 for the neck. They include AI conversational ability including GPT-4o support which makes them great robotic companions.

[ Engineered Arts ]

Quadruped robots that individual researchers can build by themselves are crucial for expanding the scope of research due to their high scalability and customizability. In this study, we develop MEVIUS, a metal quadruped robot that can be constructed and assembled using only materials ordered through e-commerce. We have considered the minimum set of components required for a quadruped robot, employing metal machining, sheet metal welding, and off-the-shelf components only.

[ MEVIUS from JSK Robotics Laboratory ]

Thanks Kento!

Avian perching maneuvers are one of the most frequent and agile flight scenarios, where highly optimized flight trajectories, produced by rapid wing and tail morphing that generate high angular rates and accelerations, reduce kinetic energy at impact. Here, we use optimal control methods on an avian-inspired drone with morphing wing and tail to test a recent hypothesis derived from perching maneuver experiments of Harris’ hawks that birds minimize the distance flown at high angles of attack to dissipate kinetic energy before impact.

[ EPFL Laboratory of Intelligent Systems ]

The earliest signs of bearing failures are inaudible to you, but not to Spot. Introducing acoustic vibration sensing: automate ultrasonic inspections of rotating equipment to keep your factory humming.

The only thing I want to know is whether Spot is programmed to actually do that cute little tilt when using its acoustic sensors.

[ Boston Dynamics ]

Hear from Jonathan Hurst, our co-founder and Chief Robot Officer, why legs are ideally suited for Digit’s work.

[ Agility Robotics ]

I don’t think “IP67” really does this justice.

[ ANYbotics ]

This paper presents a teleoperation system with floating robotic arms that traverse parallel cables to perform long-distance manipulation. The system benefits from the cable-based infrastructure, which is easy to set up and cost-effective, with an expandable workspace range.

[ EPFL ]

It seems to be just renderings for now, but here’s the next version of Fourier’s humanoid.

[ Fourier ]

Happy Oktoberfest from Dino Robotics!

[ Dino Robotics ]

This paper introduces a learning-based low-level controller for quadcopters, which adaptively controls quadcopters with significant variations in mass, size, and actuator capabilities. Our approach leverages a combination of imitation learning and reinforcement learning, creating a fast-adapting and general control framework for quadcopters that eliminates the need for precise model estimation or manual tuning.

[ HiPeR Lab ]

Parkour poses a significant challenge for legged robots, requiring navigation through complex environments with agility and precision based on limited sensory inputs. In this work, we introduce a novel method for training end-to-end visual policies, from depth pixels to robot control commands, to achieve agile and safe quadruped locomotion.

[ SoloParkour ]




de

Video Friday: Reachy 2



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH
Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

At ICRA 2024, we sat down with Pollen Robotics to talk about Reachy 2 O_o

[ Pollen Robotics ]

A robot pangolin designed to plant trees is the winner of the 2023 Natural Robotics Contest, which rewards robot designs inspired by nature. As the winning entry, the pangolin—dubbed “Plantolin”—has been brought to life by engineers at the University of Surrey in the United Kingdom. Out of 184 entries, the winning design came from Dorothy, a high school student from California.

Dr. Rob Siddall, a roboticist at the University of Surrey who built Plantolin, said, “In the wild, large animals will cut paths through the overgrowth and move seeds. This doesn’t happen nearly as much in urban areas like the South East of England—so there’s definitely room for a robot to help fill that gap. Dorothy’s brilliant design reminds us how we can solve some of our biggest challenges by looking to nature for inspiration.”

[ Plantolin ]

Our novel targeted throwing end-effector is designed to seamlessly integrate with drones and mobile manipulators. It utilizes elastic energy for efficient picking, placing, and throwing of objects, offering a versatile solution for industrial and warehouse applications. By combining a physics-based model with residual learning, it achieves increased accuracy in targeted throwing, even with previously unseen objects.

[ Throwing Manipulation, multimedia extension for IEEE Robotics and Automation Letters ]

Thanks, Nagamanikandan!
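The end-effector's actual model is not given here, but the "physics-based model with residual learning" pattern it describes can be illustrated with a toy: an ideal projectile-range formula, plus a linear residual fit by least squares to observed throws. All function names and the drag assumption below are illustrative, not from the paper:

```python
import math

def physics_range(speed, angle, g=9.81):
    """Ideal projectile range (no drag): v^2 * sin(2*theta) / g."""
    return speed ** 2 * math.sin(2 * angle) / g

def fit_residual(samples):
    """Fit observed = physics + (a * physics + b) by least squares.

    samples: list of (speed, angle, observed_distance) tuples.
    Returns a corrected predictor speed, angle -> distance."""
    xs = [physics_range(s, a) for s, a, _ in samples]
    rs = [obs - x for (_, _, obs), x in zip(samples, xs)]  # residuals
    n = len(xs)
    mx, mr = sum(xs) / n, sum(rs) / n
    a = (sum((x - mx) * (r - mr) for x, r in zip(xs, rs))
         / sum((x - mx) ** 2 for x in xs))
    b = mr - a * mx
    def corrected(speed, angle):
        x = physics_range(speed, angle)
        return x + a * x + b  # analytic prediction plus learned residual
    return corrected

# Synthetic throws where drag costs a flat 10% of the ideal range.
samples = [(s, 0.6, 0.9 * physics_range(s, 0.6)) for s in (2.0, 3.0, 4.0, 5.0)]
model = fit_residual(samples)
```

The appeal of the hybrid is that the physics term generalizes to unseen speeds and angles, while the residual absorbs whatever the model leaves out; in the actual system the residual would be a learned network over richer features rather than a one-parameter line.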

Control of off-road vehicles is challenging due to the complex dynamic interactions with the terrain. Accurate modeling of these interactions is important to optimize driving performance, but the relevant physical phenomena are too complex to model from first principles. Therefore, we present an offline meta-learning algorithm to construct a rapidly-tunable model of residual dynamics and disturbances. We evaluate our method outdoors on different slopes with varying slippage and actuator degradation disturbances, and compare against an adaptive controller that does not use the VFM terrain features.

[ Paper ]

Thanks, Sorina!

Corvus Robotics, a provider of autonomous inventory management systems, announced an updated version of its Corvus One system that brings, for the first time, the ability to fly its drone-powered system in a lights-out distribution center without any added infrastructure like reflectors, stickers, or beacons.

With obstacle detection at its core, the light-weight drone safely flies at walking speed without disrupting workflow or blocking aisles and can preventatively ascend to avoid collisions with people, forklifts, or robots, if necessary. Its advanced barcode scanning can read any barcode symbology in any orientation placed anywhere on the front of cartons or pallets.

[ Corvus Robotics ]

Thanks, Jackie!

The first public walking demo of a new humanoid from Under Control Robotics.

[ Under Control Robotics ]

The ability to accurately and rapidly identify key physiological signatures of injury – such as hemorrhage and airway injuries – proved crucial to success in the DARPA Triage Challenge Event 1. DART took the top spot in the Systems competition, while Coordinated Robotics topped the leaderboard in the Virtual competition and pulled off the win in the Data competition. All qualified teams are eligible for prizes in the Final Event. These self-funded teams won between $60,000 and $120,000 each for their first-place finishes.

[ DARPA ]

The body structure of an anatomically correct tendon-driven musculoskeletal humanoid is complex. We focused on reciprocal innervation in the human nervous system, and then implemented antagonist inhibition control (AIC) based on the reflex. To verify its effectiveness, we applied AIC to the upper limb of the tendon-driven musculoskeletal humanoid, Kengoro, and succeeded in dangling for 14 minutes and doing pull-ups.

That is also how I do pull-ups.

[ Jouhou System Kougaku Laboratory, University of Tokyo ]

Thanks, Kento!

On June 5, 2024, Digit completed its first day of work for GXO Logistics, Inc. as part of regular operations. This is the result of a multi-year agreement between GXO and Agility Robotics to begin deploying Digit in GXO’s logistics operations. This agreement, which follows a proof-of-concept pilot in late 2023, is both the industry’s first formal commercial deployment of humanoid robots and the first Robots-as-a-Service (RaaS) deployment of humanoid robots.

[ Agility Robotics ]

Although there is a growing demand for cooking behaviours as one of the expected tasks for robots, a series of cooking behaviours based on new recipe descriptions by robots in the real world has not yet been realised. In this study, we propose a robot system that integrates real-world executable robot cooking behaviour planning using the Large Language Model (LLM) and classical planning of PDDL descriptions, and food ingredient state recognition learning from a small number of data using the Vision-Language model (VLM).

[ JSK Robotics Laboratory, University of Tokyo GitHub ]

Thanks, Naoaki!

This paper introduces a novel approach to interactive robots by leveraging the form-factor of cards to create thin robots equipped with vibrational capabilities for locomotion and haptic feedback. The system is composed of flat-shaped robots with on-device sensing and wireless control, which offer lightweight portability and scalability. Applications include augmented card playing, educational tools, and assistive technology, which showcase CARDinality’s versatility in tangible interaction.

[ AxLab Actuated Experience Lab, University of Chicago ]

Azi reacts in full AI to the scripted skit it did with Ameca.

Azi uses 32 actuators, with 27 to control its silicone face, and 5 for the neck. It uses GPT-4o with a customisable personality.

[ Engineered Arts ]

We are testing a system that includes robots, structural building blocks, and smart algorithms to build large-scale structures for future deep space exploration. In this video, autonomous robots worked as a team to transport material in a mock rail system and simulate a build of a tower at our Roverscape.

[ NASA Ames Research Center ]

In the summer of 2024, HEBI’s intern Aditya Nair worked to add new use-case demos and to improve the quality and consistency of the existing demos for our robotic arms! In this video you can see teach-and-repeat, augmented reality, gravity compensation, and an impedance-control gimbal for our robotic arms.

[ HEBI Robotics ]

This video showcases cutting-edge innovations and robotic demonstrations from the Reconfigurable Robotics Lab (RRL) at EPFL. As we are closing the semester, this event brings together the exciting progress and breakthroughs made by our researchers and students over the past months. In this video, you’ll experience a collection of exciting demonstrations, featuring the latest in reconfigurable, soft, and modular robotics, aimed at tackling real-world challenges.

[ EPFL Reconfigurable Robotics Lab ]

Humanoid robot companies are promising that humanoids will fast become our friends, colleagues, employees, and the backbone of our workforce. But how close are we to this reality? What are the key costs associated with operating a humanoid? Can companies deploy them profitably? Will humanoids take our jobs, and if so, what should we be doing to prepare?

[ Human Robot Interaction Podcast ]

According to Web of Science, there have been 1,147,069 publications from 2003 to 2023 that fell under their category of “Computer Science, Artificial Intelligence.” During the same time period, 217,507 publications fell under their “Robotics” category, about 1/5th of the volume. On top of that, Canada’s published Science, Technology, and Innovation Priorities has AI at the top of the “Technology Advanced Canada” list, but robotics is not even listed. AI has also engaged the public’s imagination more so than robotics with “AI” dominating Google Search trends compared to “robotics.” This has us questioning: “Is AI Skyrocketing while Robotics Inches Forward?”

[ Ingenuity Labs RAIS2024 Robotics Debate ]




de

Video Friday: Mobile Robot Upgrades



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ROSCon 2024: 21–23 October 2024, ODENSE, DENMARK
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH
Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

One of the most venerable (and recognizable) mobile robots ever made, the Husky, has just gotten a major upgrade.

Shipping early next year.

[ Clearpath Robotics ]

MAB Robotics is developing legged robots for the inspection and maintenance of industrial infrastructure. One of the initial areas for deploying this technology is underground infrastructure, such as water and sewer canals. In these environments, resistance to factors like high humidity and working underwater is essential. To address these challenges, the MAB team has built a walking robot capable of operating fully submerged, based on exceptional self-developed robotics actuators. This innovation overcomes the limitations of current technologies, offering MAB’s first clients a unique service for trenchless inspection and maintenance tasks.

[ MAB Robotics ]

Thanks, Jakub!

The G1 robot can perform a standing long jump of up to 1.4 meters, possibly the longest jump ever achieved by a humanoid robot of its size in the world, standing only 1.32 meters tall.

[ Unitree Robotics ]

Apparently, you can print out a functional four-fingered hand on an inkjet.

[ UC Berkeley ]

We present SDS (“See it. Do it. Sorted.”), a novel pipeline for intuitive quadrupedal skill learning from a single demonstration video leveraging the visual capabilities of GPT-4o. We validate our method on the Unitree Go1 robot, demonstrating its ability to execute variable skills such as trotting, bounding, pacing, and hopping, achieving high imitation fidelity and locomotion stability.

[ Robot Perception Lab, University College London ]

You had me at “3D desk octopus.”

[ UIST 2024 ACM Symposium on User Interface Software and Technology ]

Top-notch swag from Dusty Robotics

[ Dusty Robotics ]

I’m not sure how serious this shoes-versus-no-shoes test is, but it’s an interesting result nonetheless.

[ Robot Era ]

Thanks, Ni Tao!

Introducing TRON 1, the first multimodal biped robot! With its innovative “Three-in-One” modular design, TRON 1 can easily switch among Point-Foot, Sole, and Wheeled foot ends.

[ LimX Dynamics ]

Recent works in the robot-learning community have successfully introduced generalist models capable of controlling various robot embodiments across a wide range of tasks, such as navigation and locomotion. However, achieving agile control, which pushes the limits of robotic performance, still relies on specialist models that require extensive parameter tuning. To leverage generalist-model adaptability and flexibility while achieving specialist-level agility, we propose AnyCar, a transformer-based generalist dynamics model designed for agile control of various wheeled robots.

[ AnyCar ]

Discover the future of aerial manipulation with our untethered soft robotic platform with onboard perception stack! Presented at the 2024 Conference on Robot Learning, in Munich, this platform introduces autonomous aerial manipulation that works in both indoor and outdoor environments—without relying on costly off-board tracking systems.

[ Paper ] via [ ETH Zurich Soft Robotics Laboratory ]

Deploying perception modules for human-robot handovers is challenging because they require a high degree of reactivity, generalizability, and robustness to work reliably for diverse cases. Here, we show hardware handover experiments using our efficient and object-agnostic real-time tracking framework, specifically designed for human-to-robot handover tasks with legged manipulators.

[ Paper ] via [ ETH Zurich Robotic Systems Lab ]

Azi and Ameca are killing time, but Azi struggles with being the new kid around. Engineered Arts desktop robots feature 32 actuators, 27 for facial control alone and 5 for the neck. They offer AI conversational ability, including GPT-4o support, which makes them great robotic companions, even to each other. The robots are following a script for this video, using one of their many voices.

[ Engineered Arts ]

Plato automates carrying and transporting, giving your staff more time to focus on what really matters, improving their quality of life. With a straightforward setup that requires no markers or additional hardware, Plato is incredibly intuitive to use—no programming skills needed.

[ Aldebaran ]

This UPenn GRASP Lab seminar is from Antonio Loquercio, on “Simulation: What made us intelligent will make our robots intelligent.”

Simulation-to-reality transfer is an emerging approach that enables robots to develop skills in simulated environments before applying them in the real world. This method has catalyzed numerous advancements in robotic learning, from locomotion to agile flight. In this talk, I will explore simulation-to-reality transfer through the lens of evolutionary biology, drawing intriguing parallels with the function of the mammalian neocortex. By reframing this technique in the context of biological evolution, we can uncover novel research questions and explore how simulation-to-reality transfer can evolve from an empirically driven process to a scientific discipline.

[ University of Pennsylvania ]





Remote Sub Sustains Science Kilometers Underwater



The water column is hazy as an unusual remotely operated vehicle glides over the seafloor in search of a delicate tilt meter deployed three years ago off the west side of Vancouver Island. The sensor measures shaking and shifting in continental plates that will eventually unleash another of the region’s magnitude-9.0 earthquakes (the last was in 1700). Dwindling charge in the instruments’ data loggers threatens the continuity of the data.

The 4-metric-ton, C$8-million (US $5.8-million) remotely operated vehicle (ROV) is 50 meters from its target when one of the seismic science platforms appears on its sonar imaging system, the platform’s hard edges crystallizing from the grainy background like a surgical implant jumping out of an ultrasound image. After easing the ROV to the platform, operators 2,575 meters up at the Pacific’s surface instruct its electromechanical arms and pincer hands to deftly unplug a data logger, then plug in a replacement with a fresh battery.

This mission, executed in early October, marked an exciting moment for Josh Tetarenko, director of ROV operations at North Vancouver-based Canpac Marine Services. Tetarenko is the lead designer behind the new science submersible and recently dubbed it Jenny in homage to Forrest Gump, because the fictional character named all of his boats Jenny. Swapping out the data loggers west of Vancouver Island’s Clayoquot Sound was part of a weeklong shakedown to test Jenny’s unique combination of dexterity, visualization chops, power, and pressure resistance.

By all accounts Jenny sailed through. Tetarenko says the worst they saw was a leaky O-ring and the need to add some spring to a few bumpers. “Usually you see more things come up the first time you dive a vehicle to those depths,” says Tetarenko.

Jenny’s successful maiden cruise is just as important for Victoria, B.C.–based Ocean Networks Canada (ONC), which operates the NEPTUNE undersea observatory. The North-East Pacific Time-series Undersea Networked Experiments array boasts thousands of sensors and instruments, including deep-sea video cameras, seismometers, and robotic rovers sprawled across this corner of the Pacific. Most of these are connected to shore via an 812-kilometer power and communications cable. Jenny was custom-designed to perform the annual maintenance and equipment swaps that have kept live data streaming from that cabled observatory nearly continuously for the past 15 years, despite trawler strikes, a fault on its backbone cable, and insults from corrosion, crushing pressures, and fouling.

NEPTUNE remains one of the world’s largest installations for oceanographic science despite a proliferation of such cabled observatories since it went live in 2009. ONC’s open data portal has over 37,000 registered users tapping over 1.5 petabytes of ocean data—information that’s growing in importance with the intensification of climate change and the collapse of marine ecosystems.

Over the course of Jenny’s maiden cruise, her operators swapped devices in and out at half a dozen ONC sites, including at several of NEPTUNE’s five nodes and at one of NEPTUNE’s smaller sister observatories closer to Vancouver.

Inside Jenny

ROV Jenny aboard the Valour, Canpac’s 50-meter offshore workhorse, ahead of October’s NEPTUNE observatory maintenance cruise. Ocean Networks Canada

What makes Jenny so special?

  • Jenny is only the third science ROV designed for subsea work to a depth of 6,000 meters.
  • Motion sensors actively adjust her 7,000-meter-long umbilical cable to counteract topside wave action that would otherwise yank the ROV around at depth and, in rough seas, could damage or snap the cable.
  • Dual high-dexterity manipulator arms are controlled by topside operators via a pair of replica mini-manipulators that mirror the movements.
  • Each arm is capable of picking up objects weighing about 275 kilograms, and the ROV itself can transport equipment weighing up to 3,000 kilograms.
  • 11 high-resolution cameras deliver 4K video, supported by 300,000 lumens of lighting that can be tuned to deliver the soft red light needed to observe bioluminescence.
  • Dual multibeam sonar systems maximize visibility in turbid water.
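The umbilical compensation in the list above is a form of active heave compensation: motion sensors measure the ship's vertical (heave) velocity, and the winch pays cable in or out to cancel it so the ROV at depth stays still. A minimal simulation sketch of the idea follows; all numbers and function names are illustrative, not Jenny's actual specifications.

```python
import math

def ship_heave_velocity(t, amp=1.5, period=8.0):
    """Vertical ship velocity (m/s) from one sinusoidal swell component.
    In a real system this would come from a motion reference unit (IMU)."""
    omega = 2 * math.pi / period
    return amp * omega * math.cos(omega * t)

def simulate(duration=60.0, dt=0.05, winch_bandwidth=5.0):
    """The winch tracks the negated heave velocity with a first-order lag.
    Returns peak ROV depth excursion with and without compensation."""
    v_winch = 0.0
    err_on = err_off = 0.0       # integrated ROV motion (meters)
    peak_on = peak_off = 0.0
    for k in range(int(duration / dt)):
        v_ship = ship_heave_velocity(k * dt)
        # first-order winch response toward the cancelling command -v_ship
        v_winch += winch_bandwidth * (-v_ship - v_winch) * dt
        err_on += (v_ship + v_winch) * dt   # residual motion, compensated
        err_off += v_ship * dt              # motion with no compensation
        peak_on = max(peak_on, abs(err_on))
        peak_off = max(peak_off, abs(err_off))
    return peak_on, peak_off
```

With a 1.5-meter swell on an 8-second period, the uncompensated ROV end would rise and fall by the full swell height, while even this simple lagged compensator cuts the excursion to a fraction of that.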

Meghan Paulson, ONC’s executive director for observatory operations, says the sonar imaging system will be particularly invaluable during dives to shallower sites where sediments stirred up by waves and weather can cut visibility from meters to centimeters. “It really reduces the risk of running into things accidentally,” says Paulson.

To experience the visibility conditions for yourself, check out recordings of the live video broadcast from the NEPTUNE maintenance cruise. Tetarenko says that next year they hope to broadcast not only the main camera feed but also one of the sonar images.

3D video could be next, according to Canpac ROV pilot and Jenny codesigner, James Barnett. He says they would need to boost the computing power installed topside, to process that “firehose of data,” but insists that real-time 3D is “definitely not impossible.” Tetarenko says the science ROV community is collaborating on software to help make that workable: “3D imaging is kind of the very latest thing that’s being tested on lots of ROV systems right now, but nobody’s really there yet.”

More Than Science

Expansion of the cabled observatory concept is the more certain technological legacy for ONC and NEPTUNE. In fact, the technology has evolved beyond just oceanography applications.

ONC tapped Alcatel Submarine Networks (ASN) to design and build the NEPTUNE backbone, and the French firm delivered a system that has reliably supplied multigigabit Ethernet plus 10 kilovolts of direct-current electricity to the deep sea. Today ASN deploys a second-generation subsea power and communications networking solution, developed with the Norwegian international energy company Equinor.

ASN’s “Direct Current/Fiber Optic” or DC/FO system provides the 100-km backbone for the ARCA subsea neutrino observatory near Sicily, in addition to providing control systems for a growing number of offshore oil and gas installations. The latter include projects led by Equinor and BP where DC/FO networks drive the subsea injection of captured carbon dioxide and monitor its storage below the seabed. Future oil and gas projects will increasingly rely on the cables’ power supply to replace the hydraulic lines that have traditionally been used to operate machinery on the seafloor, according to Ronan Michel, ASN’s product line manager for oil and gas solutions.

Michel says DC/FO incorporates important lessons learned from the Neptune installation. And the latter’s existence was a crucial prerequisite. “The DC/FO solution would probably not exist if Neptune Canada would not have been developed,” says Michel. “It probably gave confidence to Equinor that ASN was capable to develop subsea power and coms infrastructure.”





Video Friday: Swiss-Mile Robot vs. Humans



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

Swiss-Mile’s robot (which is really any robot that meets the hardware requirement to run their software) is faster than “most humans.” So what does that mean, exactly?

The winner here is Riccardo Rancan, who doesn’t look like he was trying especially hard—he’s the world champion in high-speed urban orienteering, which is a sport that I did not know existed but sounds pretty awesome.

[ Swiss-Mile ]

Thanks, Marko!

Oh good, we’re building giant fruit fly robots now.

But seriously, this is useful and important research because understanding the relationship between a nervous system and a bunch of legs can only be helpful as we ask more and more of legged robotic platforms.

[ Paper ]

Thanks, Clarus!

Watching humanoids get up off the ground will never not be fascinating.

[ Fourier ]

The Kepler Forerunner K2 represents the Gen 5.0 robot model, showcasing a seamless integration of the humanoid robot’s cerebral, cerebellar, and high-load body functions.

[ Kepler ]

Diffusion Forcing combines the strengths of full-sequence diffusion models (like Sora) and next-token models (like LLMs), acting as either one, or a mix of the two, at sampling time for different applications without retraining.
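A hedged sketch of the core idea as described above (names and the noise schedule here are illustrative, not the paper's actual code): during training each token in a sequence gets its own independently sampled noise level, so the same model spans both regimes, full-sequence diffusion (all tokens equally noisy) and next-token generation (context clean, future token pure noise).

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_sequence(tokens, levels):
    """Interpolate each token toward Gaussian noise by its own level in [0, 1],
    using a variance-preserving mix: level 0 = clean, level 1 = pure noise."""
    eps = rng.standard_normal(tokens.shape)
    lv = levels[:, None]
    return np.sqrt(1.0 - lv) * tokens + np.sqrt(lv) * eps

seq = rng.standard_normal((6, 4))            # 6 tokens, 4 dimensions each

# Training regime: every token gets an independent noise level.
train_levels = rng.uniform(0.0, 1.0, 6)
noisy_train = noise_sequence(seq, train_levels)

# Sampling as a next-token model: context tokens clean, last token pure noise.
ar_levels = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 1.0])
noisy_ar = noise_sequence(seq, ar_levels)
```

The denoiser trained on such mixed-level sequences can then be queried with whatever per-token level pattern the application calls for.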

[ MIT ]

Testing robot arms for space is no joke.

[ GITAI ]

Welcome to the Modular Robotics Lab (ModLab), a subgroup of the GRASP Lab and the Mechanical Engineering and Applied Mechanics Department at the University of Pennsylvania under the supervision of Prof. Mark Yim.

[ ModLab ]

This is much more amusing than it has any right to be.

[ Westwood Robotics ]

Let’s go for a walk with Adam at IROS’24!

[ PNDbotics ]

From Reachy 1 in 2023 to our newly launched Reachy 2, our grippers have been designed to enhance precision and dexterity in object manipulation. Some of the models featured in the video are prototypes used for various tests, showing the innovation behind the scenes.

[ Pollen ]

I’m not sure how else you’d efficiently spray the tops of trees? Drones seem like a no-brainer here.

[ SUIND ]

Presented at ICRA40 in Rotterdam, we show the challenges faced by mobile manipulation platforms in the field. We at CSIRO Robotics are working steadily towards a collaborative approach to tackle such challenging technical problems.

[ CSIRO ]

ABB is best known for arms, but it looks like they’re exploring AMRs (autonomous mobile robots) for warehouse operations now.

[ ABB ]

Howie Choset, Lu Li, and Victoria Webster-Wood of the Manufacturing Futures Institute explain their work to create specialized sensors that allow robots to “feel” the world around them.

[ CMU ]

Columbia Engineering Lecture Series in AI: “How Could Machines Reach Human-Level Intelligence?” by Yann LeCun.

Animals and humans understand the physical world, have common sense, possess a persistent memory, can reason, and can plan complex sequences of subgoals and actions. These essential characteristics of intelligent behavior are still beyond the capabilities of today’s most powerful AI architectures, such as Auto-Regressive LLMs.
I will present a cognitive architecture that may constitute a path towards human-level AI. The centerpiece of the architecture is a predictive world model that allows the system to predict the consequences of its actions and to plan sequences of actions that fulfill a set of objectives. The objectives may include guardrails that guarantee the system’s controllability and safety. The world model employs a Joint Embedding Predictive Architecture (JEPA) trained with self-supervised learning, largely by observation.
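The planning loop described in the abstract can be sketched in miniature. This is an illustrative stand-in, not LeCun's actual architecture: a toy "world model" (here, hand-coded point-mass dynamics in place of a learned JEPA predictor) plus a random-shooting planner that scores candidate action sequences against an objective and returns the best first action.

```python
import numpy as np

rng = np.random.default_rng(0)

def world_model(state, action):
    """Stand-in for a learned predictor: point-mass dynamics.
    state = [position, velocity]; the action is an acceleration."""
    pos, vel = state
    return np.array([pos + 0.1 * vel, vel + 0.1 * action])

def objective(state):
    """Task cost: squared distance of the position to a goal at 1.0.
    Guardrails could be added here as penalty terms."""
    return (state[0] - 1.0) ** 2

def plan(state, horizon=10, candidates=256):
    """Random-shooting planner: roll out candidate action sequences through
    the world model, return the best first action and its rollout cost."""
    best_cost, best_action = float("inf"), 0.0
    for _ in range(candidates):
        actions = rng.uniform(-1.0, 1.0, horizon)
        s, cost = state.copy(), 0.0
        for a in actions:
            s = world_model(s, a)
            cost += objective(s)
        if cost < best_cost:
            best_cost, best_action = cost, actions[0]
    return best_action, best_cost

first_action, cost = plan(np.array([0.0, 0.0]))
```

Swapping the hand-coded dynamics for a learned latent predictor, and random shooting for a gradient-based or sampling-based optimizer, gives the general shape of model-predictive planning with a world model.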

[ Columbia ]





Schoolhouse Limbo: How Low Will They Go To 'Better' Grades?

Maryland's new education chief, Carey Wright, an old-school champion of rigorous standards, is pushing back against efforts in other states to boost test scores by essentially lowering their expectations.





Trump Will Reverse Biden's Israel Delusions

Donald Trump will embrace the truth Joe Biden has refused to countenance: Israel's enemies are America's enemies. And when Israel defeats its enemies, America wins.





What Should Biden Do? Get a Peace Deal in Ukraine

The end to this bloody stalemate must come with negotiation, and Putin should not wait until Trump is in the White House, says Guardian columnist Simon Jenkins





The Election Depleted Us. Storytelling Can Revive Us

As we share our truths and witness each other's, we build unity and community.





Demand Senators Publicly Support a Leader Who's Pro-Trump

Hours after Donald Trump wins the most conclusive mandate in 40 years, Mitch McConnell engineers a coup against his agenda by calling early leadership elections in the Senate.





Harris' Home City Kicked Out Its Progressive Leaders

Oakland's mayor and district attorney were both sent packing in a recall vote. Leaders in other Democratic-run cities should take notice.





The Case for Mass Deportations

It's hard to imagine opposing Trump's proposal. Who would want to help murderers and drug dealers who entered the country illegally remain in the United States?





Too Many See the Democrats as a Hostile Elite

Even though that perception is partly the creation of right-wing media, the Democrats surely need to hone their identity.