Browse Topic: Robotics
Over the decades, robotics deployments have been driven by rapid, in-parallel research advances in sensing, actuation, simulation, algorithmic control, communication, and high-performance computing, among others. Collectively, their integration within a cyber-physical-systems framework has supercharged the increasingly complex realization of the real-time ‘sense-think-act’ robotics paradigm. Successful functioning of modern-day robots relies on seamless integration of increasingly complex systems (coming together at the component, subsystem, system, and system-of-systems levels) as well as their systematic treatment throughout the life cycle (from cradle to grave). As a consequence, ‘dependency management’ across the physical and algorithmic inter-dependencies of the multiple system elements is crucial for enabling synergistic (or managing adversarial) outcomes. Furthermore, the steep learning curve for customizing the technology for platform-specific deployment discourages domain experts from adopting it.
Researchers at Universidad Carlos III de Madrid (UC3M) have developed a new soft joint model for robots with an asymmetrical triangular structure and an extremely thin central column. This breakthrough, recently patented, allows for versatility of movement, adaptability and safety, and will have a major impact in the field of robotics.
Accurate object pose estimation refers to a robot's ability to determine both the position and orientation of an object. It is essential for robotics, especially in pick-and-place tasks, which are crucial in industries such as manufacturing and logistics. As robots are increasingly tasked with complex operations, their ability to precisely determine the six degrees of freedom (6D pose) of objects, that is, their position and orientation, becomes critical. This ability ensures that robots can interact with objects in a reliable and safe manner. However, despite advancements in deep learning, the performance of 6D pose estimation algorithms largely depends on the quality of the data they are trained on.
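As a minimal illustration of what a 6D pose encodes (a sketch, not the estimation method described above), the Python snippet below packs a rotation and a translation into a 4x4 homogeneous transform and uses it to express an object-frame grasp point in the robot's frame; all numeric values are placeholders.

```python
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def rot_z(theta: float) -> np.ndarray:
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Placeholder 6D pose: 30 degrees about z, offset 0.4 m in x and 0.1 m in z.
T_obj = pose_matrix(rot_z(np.deg2rad(30.0)), np.array([0.4, 0.0, 0.1]))

# A grasp point defined in the object's own frame, expressed in the robot/camera frame.
p_obj = np.array([0.02, 0.0, 0.05, 1.0])   # homogeneous coordinates
p_robot = T_obj @ p_obj
print("Grasp point in robot frame:", p_robot[:3])
```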
Drone show accidents highlight the challenges of maintaining safety in what engineers call “multiagent systems” — systems of multiple coordinated, collaborative, and computer-programmed agents, such as robots, drones, and self-driving cars.
Advances in artificial intelligence (AI), machine learning (ML), and sensor fusion drive robotics functionality across many applications, including healthcare. Ongoing innovations in high-speed connectivity, edge computing, network redundancy, and fail-safe procedures are crucial to optimizing robotics opportunities. The emergence of natural language processing and emotional AI functionality is poised to propel more intuitive, responsive, and adaptive human-machine interaction.
Los Angeles-based plastics contract manufacturer Kal Plastics deployed a UR10e trimming cobot for a fraction of the cost and lead time of a CNC machine, cut trimming time nearly in half, and reduced late shipments to under one percent — all while improving employee safety and growth opportunities.
A team of engineers is on a mission to redefine mobility by providing innovative wearable solutions to physical therapists, orthotic and prosthetic professionals, and individuals experiencing walking impairment and disability. Co-founded by Ray Browning and Zach Lerner, Portland-based startup Biomotum aims “to empower mobility by energizing every step” through its wearable robotics technology.
In creating a pair of new robots, Cornell researchers cultivated an unlikely component: fungal mycelia. By harnessing mycelia’s innate electrical signals, the researchers discovered a new way of controlling “biohybrid” robots that can potentially react to their environment better than their purely synthetic counterparts.
Researchers are developing soft sensor materials based on ceramics. Such sensors can feel temperature, strain, pressure, or humidity, for instance, which makes them interesting for use in medicine, but also in the field of soft robotics.
Researchers from the School of Engineering of the Hong Kong University of Science and Technology (HKUST) have successfully developed what they believe is the world’s smallest multifunctional biomedical robot. Capable of imaging, high-precision motion, and multifunctional operations like sampling, drug delivery, and laser ablation, the robot offers competitive imaging performance and a tenfold improvement in obstacle detection, paving the way for robotic applications in narrow and challenging channels of the human body, such as the lung’s end bronchi and the oviducts.
Robotics researchers have already made great strides in developing sensors that can perceive changes in position, pressure, and temperature — all of which are important for technologies like wearable devices and human-robot interfaces. But a hallmark of human perception is the ability to sense multiple stimuli at once, and this is something that robotics has struggled to achieve.
Researchers have developed a multifunctional sensor based on semiconductor fibers that emulates the five human senses. Prof. Bonghoon Kim, department of robotics and mechatronics engineering of Daegu Gyeongbuk Institute of Science & Technology (DGIST), conducted the study in collaboration with Prof. Sangwook Kim at KAIST, Prof. Janghwan Kim at Ajou University, and Prof. Jiwoong Kim at Soongsil University. The technology developed in the study is expected to be utilized in fields such as wearables, Internet of Things (IoT), electronic devices, and soft robotics.
Insect cyborgs may sound like science fiction, but they are a relatively new phenomenon based on using electrical stimuli to control the movement of insects. These hybrid insect-computer robots, as they are scientifically called, herald the future of small, highly mobile, and efficient devices.
Soft bending actuators have garnered significant interest in robotics and biomedical engineering due to their ability to mimic the bending motions of natural organisms. Most soft pneumatic bending actuators are designed around either positive or negative pressure alone. In this study, we propose a novel soft bending actuator that utilizes combined positive and negative pressures to achieve enhanced performance and control. The actuator consists of a flexible elastomeric chamber divided into two compartments: a positive-pressure chamber and a negative-pressure chamber. Controlled bending motion can be achieved by selectively applying positive and negative pressures to the respective chambers. The combined positive and negative pressure allowed for faster response times and increased flexibility compared to traditional soft actuators. Because of its adaptability, controllability, and improved performance, the actuator can be used for various tasks that call for careful, precise manipulation.
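The abstract above does not give a control law, but as a hedged sketch of "selectively applying positive and negative pressures to the respective chambers," the snippet below maps a normalized bend command to a pair of chamber pressure set-points; the linear scaling and the pressure limits are illustrative assumptions, not values from the study.

```python
def chamber_setpoints(bend: float,
                      max_positive_kpa: float = 60.0,   # assumed pressure limit
                      max_vacuum_kpa: float = -40.0     # assumed vacuum limit
                      ) -> tuple[float, float]:
    """Map a normalized bend command in [0, 1] to (positive-chamber, negative-chamber)
    gauge pressures in kPa, pressurizing one compartment while evacuating the other."""
    bend = max(0.0, min(1.0, bend))
    return bend * max_positive_kpa, bend * max_vacuum_kpa

# Example: a 50 % bend command pressurizes one chamber to 30 kPa and pulls -20 kPa on the other.
p_pos, p_neg = chamber_setpoints(0.5)
print(f"positive chamber: {p_pos:.0f} kPa, negative chamber: {p_neg:.0f} kPa")
```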
A fast and agile robotic insect developed by MIT could someday aid in mechanical pollination.
Researchers have helped create a new 3D printing approach for shape-changing materials that are likened to muscles, opening the door for improved applications in robotics as well as biomedical and energy devices.
Soft skin coverings and touch sensors have emerged as a promising feature for robots that are both safer and more intuitive for human interaction, but they are expensive and difficult to make. A recent study demonstrates that soft skin pads doubling as sensors made from thermoplastic urethane can be efficiently manufactured using 3D printers.
A team led by Emily Davidson has reported that they used a class of widely available polymers called thermoplastic elastomers to create soft 3D printed structures with tunable stiffness. Engineers can design the print path used by the 3D printer to program the plastic’s physical properties so that a device can stretch and flex repeatedly in one direction while remaining rigid in another. Davidson, an assistant professor of chemical and biological engineering, says this approach to engineering soft architected materials could have many uses, such as soft robots, medical devices and prosthetics, strong lightweight helmets, and custom high-performance shoe soles.
Many technologies are now emerging, such as firefighting robots, quadcopters, and drones, that are capable of operating in hazardous disaster scenarios. In recent years, fire emergencies have become an increasingly serious problem, leading to hundreds of deaths, thousands of injuries, and the destruction of property worth millions of dollars. According to the National Crime Records Bureau (NCRB), India recorded approximately 1,218 fire incidents resulting in 1,694 deaths in 2020 alone. Globally, the World Health Organization (WHO) estimates that fires account for around 265,000 deaths each year, with the majority occurring in low- and middle-income countries. Existing fire-extinguishing systems are often inefficient and lack proper testing, causing significant delays in firefighting efforts. These delays become even more critical in situations involving high-rise buildings or bushfires, where reaching the affected areas is particularly challenging.
Researchers have developed a fully embedded wireless brain neural signal recorder. The device was created by Prof. Jang Kyung-in of the department of robotics and mechanical electronics at DGIST in collaboration with a research team led by Lee Young-jeon of the Korea Research Institute of Bioscience & Biotechnology.
Need a moment of levity? Try watching videos of astronauts falling on the Moon. NASA’s outtakes of Apollo astronauts tripping and stumbling as they bounce in slow motion are delightfully relatable. For MIT engineers, the lunar bloopers also highlight an opportunity to innovate.
While there are concerned voices speaking of a certain disillusionment with AI, as with any potentially game-changing technology, the expectations and fears are often exaggerated. Specifically in relation to industrial robotics, the potential of AI’s impact has, if anything, been muted.
Inspired by a small and slow snail, scientists have developed a robot prototype that may one day scoop up microplastics from the surfaces of oceans, seas, and lakes. The robot’s design is based on the Hawaiian apple snail (Pomacea canaliculata), a common aquarium snail that uses the undulating motion of its foot to drive water surface flow and suck in floating food particles.
Researchers have developed a new soft robot design that engages in three simultaneous behaviors: rolling forward, spinning like a record, and following a path that orbits around a central point. The device, which operates without human or computer control, holds promise for developing soft robotic technologies that can be used to navigate and map unknown environments.
A team led by University of Maryland computer scientists invented a camera mechanism that improves how robots see and react to the world around them. Inspired by how the human eye works, their innovative camera system mimics the tiny involuntary movements used by the eye to maintain clear and stable vision over time. The team’s prototyping and testing of the camera — called the Artificial Microsaccade-Enhanced Event Camera (AMI-EV) — was detailed in a paper published in the journal Science Robotics in May 2024.
LIDAR-based autonomous mobile robots (AMRs) are gradually being adopted for gas detection in industry. They can detect small changes in the composition of the air in indoor areas that are too risky for humans, making them well suited to gas detection. This work focuses on detecting gas leakages, including toxic and explosive gases, and avoiding accidents in industrial settings by using an AMR equipped with a LIDAR sensor for autonomous navigation and an MQ-2 gas sensor for leak identification, alerting the necessary personnel in real time. A simultaneous localization and mapping (SLAM) algorithm combined with gas distribution mapping (GDM) directs the robot toward the leakage point immediately, thereby helping to avoid accidents. A Raspberry Pi 4 handles data processing, and the hardware uses PGM45775 DC motors for locomotion together with a 2D LIDAR that provides 360° mapping.
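The summary above does not include implementation details, but as a hedged sketch of how GDM might combine SLAM poses with MQ-2 readings, the snippet below accumulates normalized gas readings on a coarse 2D grid and returns the highest-concentration cell as a candidate leak location; the grid resolution, workspace size, and alert threshold are assumptions for illustration only.

```python
import numpy as np

GRID_RES = 0.5          # metres per cell (assumed)
GRID_SIZE = 40          # 20 m x 20 m workspace (assumed)
ALERT_THRESHOLD = 0.6   # normalized MQ-2 level that triggers an alert (assumed)

gas_sum = np.zeros((GRID_SIZE, GRID_SIZE))
gas_count = np.zeros((GRID_SIZE, GRID_SIZE))

def update_gas_map(x_m: float, y_m: float, mq2_level: float) -> None:
    """Accumulate a normalized MQ-2 reading at the robot's SLAM-estimated (x, y) position."""
    i = int(np.clip(x_m / GRID_RES, 0, GRID_SIZE - 1))
    j = int(np.clip(y_m / GRID_RES, 0, GRID_SIZE - 1))
    gas_sum[i, j] += mq2_level
    gas_count[i, j] += 1

def suspected_leak_cell():
    """Return the centre of the highest-mean-concentration cell, or None below the threshold."""
    mean = np.divide(gas_sum, gas_count, out=np.zeros_like(gas_sum), where=gas_count > 0)
    i, j = np.unravel_index(np.argmax(mean), mean.shape)
    if mean[i, j] < ALERT_THRESHOLD:
        return None
    return (i + 0.5) * GRID_RES, (j + 0.5) * GRID_RES

# Example: feed a few pose/reading pairs, then query a navigation goal for the AMR.
for x, y, level in [(2.0, 3.0, 0.2), (2.5, 3.0, 0.7), (3.0, 3.5, 0.9)]:
    update_gas_map(x, y, level)
print("Suspected leak near:", suspected_leak_cell())
```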
Researchers and engineers at the U.S. Army Combat Capabilities Development Command Chemical Biological Center (DEVCOM CBC), Aberdeen Proving Ground, MD, have developed a prototype system for decontaminating military combat vehicles. DEVCOM CBC is paving the way and helping the Army transform into a multi-domain force through its modernization and priority research efforts, which are linked to the National Defense Strategy and the nation's goals. CBC continues to lead in the development of innovative defense technology, including autonomous chem-bio defense solutions designed to enhance accuracy and safety for the warfighter.
The future of wireless technology - from charging devices to boosting communication signals - relies on the antennas that transmit electromagnetic waves becoming increasingly versatile, durable and easy to manufacture. Researchers at Drexel University and the University of British Columbia believe kirigami, the ancient Japanese art of cutting and folding paper to create intricate three-dimensional designs, could provide a model for manufacturing the next generation of antennas. Recently published in the journal Nature Communications, research from the Drexel-UBC team showed how kirigami - a variation of origami - can transform a single sheet of acetate coated with conductive MXene ink into a flexible 3D microwave antenna whose transmission frequency can be adjusted simply by pulling or squeezing to slightly shift its shape.
Researchers have successfully demonstrated the four-dimensional (4D) printing of shape memory polymers in submicron dimensions that are comparable to the wavelength of visible light. 4D printing enables 3D-printed structures to change their configurations over time and is used in a variety of fields such as soft robotics, flexible electronics, and medical devices.
Penn Engineers have developed a new algorithm that allows robots to react to complex physical contact in real time, making it possible for autonomous robots to succeed at previously impossible tasks, like controlling the motion of a sliding object.
Researchers led by Professor Young Min Song from the Gwangju Institute of Science and Technology (GIST) have unveiled a vision system inspired by feline eyes to enhance object detection in various lighting conditions. Featuring a unique shape and reflective surface, the system reduces glare in bright environments and boosts sensitivity in low-light scenarios. By filtering unnecessary details, this technology significantly improves the performance of single-lens cameras, representing a notable advancement in robotic vision capabilities.