Browse Topic: Cameras
The automotive industry is rapidly advancing toward autonomous vehicles, making sensors such as cameras, LiDAR, and RADAR critical for maintaining constant information exchange between the vehicle and its surroundings. However, these sensors are vulnerable to harsh environmental conditions such as rain, dirt, snow, and bird droppings, which can impair their function and disrupt accurate vehicle maneuvers. To keep all sensors operating effectively, dedicated cleaning is implemented, particularly on Level 3 and higher autonomous vehicles. Sensor cleaning mechanisms must be tested across different weather conditions and vehicle operating scenarios to ensure reliability and performance. One crucial aspect of testing is tracking the trajectory of the cleaning fluid to ensure it does not cause self-soiling of the vehicle or impair the field of view or visibility zones of other components such as the windshield. While wind tunnel tests are valuable, digitalizing …
Stoneridge displayed its vision for the future of commercial vehicle technology on the SAE COMVEC 2025 exhibit floor. The Innovation Truck showcases the Tier 1 supplier's next-generation vision and driver-assistance technologies designed to enhance driver safety and fleet optimization. Mario Gafencu, product design and evaluation specialist at Stoneridge, gave Truck & Off-Highway Engineering a tech truck walkaround at the event. The first technology Gafencu detailed was the second-generation MirrorEye camera monitor system that's designed to replace the glass mirrors on the sides of a truck.
Planetary and lunar rover exploration missions can encounter environments that do not allow navigation by typical stereo camera-based systems. Stereo cameras struggle in areas with low ambient light (even when lit by floodlights), direct sunlight, or washed-out scenes. Improved sensors are required for safe and successful rover mobility in harsh conditions. NASA Goddard Space Flight Center has developed a Space Qualified Rover LiDAR (SQRLi) system that will improve rover sensing capabilities in a small, lightweight package. The SQRLi package is designed to survive the hazardous space environment and provide valuable image data during planetary and lunar rover exploration.
As I'm wont to do come December, with work well underway on the first issue of the new year, I like to take stock of upcoming venues for innovative product reveals and thought-provoking presentations on emerging trends and technologies. Come the first week of January, that means CES in Las Vegas. Traditional equipment manufacturers have increasingly used the event to demonstrate to the broader public that they not only deal in metal but also the digital realm. For example, earlier this year at CES, John Deere revealed its second-generation tech stack featuring camera pods, Nvidia Orin purpose-built processors and Deere's VPUs (vision processing units), along with four new autonomous machines including the 9RX 640 tractor for open-field ag operations. The company is exhibiting again this coming year.
Elbit Systems, Haifa, Israel
Waiting for a wound to heal is incredibly frustrating. First, it must clot; then an immune system response is needed; followed by scabbing and scarring — and that’s not even getting into the pain part.
Measuring the volume of harvested material behind the machine can benefit various agricultural operations, such as baling, dropping, material decomposition, cultivation, and seeding. This paper presents a methodology for determining that volume. The approach can help predict the amount of residue available in the field, assess field readiness for the next production cycle, measure residue distribution, determine hay readiness for baling, and evaluate the quantity of hay present in the field, among other applications of benefit to the customer. Efficient post-harvest residue management is essential for sustainable agriculture. This paper presents an Automated Offboard System that leverages remote sensing, IoT, image processing, and machine learning/deep learning (ML/DL) to measure the volume of harvested material in real time. The system integrates onboard cameras and satellite imagery to analyze the field …
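The abstract does not give the paper's algorithm, but the core volume calculation it implies can be sketched as follows. This is a minimal, hypothetical illustration, assuming the imaging pipeline has already produced a per-cell height map of residue over a regular ground grid; the function name and grid layout are assumptions, not the authors' method.

```python
def residue_volume(height_map_m, cell_area_m2):
    """Estimate residue volume (m^3) from a 2D grid of residue heights (m).

    height_map_m: nested lists of per-cell residue heights, as might be
                  derived from camera or satellite imagery (assumed input)
    cell_area_m2: ground area covered by one grid cell
    """
    # Negative heights would be sensor noise; clamp them to zero.
    total_height = sum(max(h, 0.0) for row in height_map_m for h in row)
    return total_height * cell_area_m2

# Toy example: a 3x3 field patch with 0.25 m^2 cells.
patch = [[0.1, 0.2, 0.1],
         [0.0, 0.3, 0.2],
         [0.1, 0.1, 0.0]]
print(residue_volume(patch, 0.25))  # approximately 0.275 m^3
```

In practice the height map itself would come from stereo reconstruction or ML-based depth estimation, which is where the paper's ML/DL components would sit.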
Researchers have developed a prototype imaging system that could significantly improve doctors’ ability to detect cancerous tissue during endoscopic procedures. This approach combines light-emitting diodes (LEDs) with hyperspectral imaging technology to create detailed maps of tissue properties that are invisible to conventional endoscopic cameras.
In today’s digital age, the use of “Internet-of-Things” devices (embedded with software and sensors) has become widespread. These devices include wireless equipment, autonomous machinery, wearable sensors, and security systems. Because of their intricate structures and properties, there is a need to scrutinize them closely to assess their safety and utility and to rule out potential defects. At the same time, damage to the device during inspection must be avoided.
Image sensors built into every smartphone and digital camera distinguish colors much as the human eye does. In our retinas, individual cone cells recognize red, green, and blue (RGB). In image sensors, individual pixels absorb the corresponding wavelengths and convert them into electrical signals.
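The wavelength-to-channel mapping described above can be illustrated with a toy classifier. The band edges below are illustrative visible-spectrum ranges, not any sensor's actual filter specification, and the function is hypothetical.

```python
def classify_wavelength_nm(wavelength_nm):
    """Roughly map a visible wavelength (nm) to the RGB channel whose
    filter passes it, mimicking cone cells or a Bayer color filter.
    Band edges are illustrative, not a sensor spec."""
    if 450 <= wavelength_nm < 495:
        return "B"
    if 495 <= wavelength_nm < 570:
        return "G"
    if 620 <= wavelength_nm <= 750:
        return "R"
    return None  # outside these illustrative bands

print(classify_wavelength_nm(530))  # "G" — mid-spectrum green light
```

A real pixel does not classify discretely, of course; each channel has a continuous spectral response curve, and the electrical signal is proportional to the absorbed light.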
Northwestern engineers have developed a new system for full-body motion capture — and it doesn’t require specialized rooms, expensive equipment, bulky cameras, or an array of sensors. Instead, it requires a simple mobile device.
Engineers have developed a smart capsule called PillTrek that can measure pH, temperature, and a variety of biomarkers. It incorporates simple, inexpensive sensors into a miniature wireless electrochemical workstation that relies on low-power electronics. PillTrek measures 7 mm in diameter and 25 mm in length, making it smaller than commercially available capsule cameras used for endoscopy, yet capable of executing a range of electrochemical measurements.
The U-Shift IV represents the latest evolution in modular urban mobility solutions, offering significant advancements over its predecessors. This innovative vehicle concept introduces a distinct separation between the drive module, known as the driveboard, and the transport capsules. The driveboard contains all the necessary components for autonomous driving, allowing it to operate independently. This separation not only enables versatile applications, such as easily swapping capsules for passenger or goods transport, but also significantly improves utilization of the driveboard. By allowing a single driveboard to be paired with different capsules, operational efficiency is maximized, enabling continuous deployment of driveboards while the individual capsules are in use. The primary focus of U-Shift IV was to obtain a permit for operating at the Federal Garden Show 2023. To achieve this goal, we built the vehicle around the specific requirements for semi-public road …
With 2D cameras and space robotics algorithms, astronautics engineers at Stanford have created a navigation system able to manage multiple satellites using visual data alone. They recently tested it in space for the first time.
Stanford University, Stanford, CA
Someday, instead of large, expensive individual satellites, teams of smaller satellites, known to scientists as a “swarm,” will work in collaboration, enabling greater accuracy, agility, and autonomy. Among the scientists working to make these teams a reality are researchers at Stanford University's Space Rendezvous Lab, who recently completed the first-ever in-orbit test of a prototype system able to navigate a swarm of satellites using only visual information shared through a wireless network. “It's a milestone paper and the culmination of 11 years of effort by my lab, which was founded with this goal of surpassing the current state of the art and practice in distributed autonomy in space,” said Simone D'Amico.
In October 2024, Kongsberg NanoAvionics discovered damage to its MP42 satellite and used the discovery as an opportunity to raise awareness of the need to reduce space debris generated by satellites.
Kongsberg NanoAvionics, Vilnius, Lithuania
Our MP42 satellite, which launched into low Earth orbit (LEO) two and a half years ago aboard the SpaceX Transporter-4 mission, recently took an unexpected hit from a small piece of space debris or a micrometeoroid. The impact created a 6 mm hole, roughly the size of a chickpea, in one of its solar panels. Despite this damage, the satellite continued performing its mission without interruption, and we only discovered the impact thanks to an image taken by its onboard selfie camera in October 2024. It is challenging to pinpoint exactly when the impact occurred because MP42's previous selfie was taken a year and a half earlier, in April 2023.
This study presents a novel methodology for optimizing the acoustic performance of rotating machinery by combining scattered 3D sound intensity data with numerical simulations. The method is demonstrated on the rear axle of a truck. Using Scan&Paint 3D, sound intensity data is rapidly acquired over a large spatial area with the assistance of a 3D sound intensity probe and infrared stereo camera. The experimental data is then integrated into far-field radiation simulations, enabling detailed analysis of the acoustic behavior and accurate predictions of far-field sound radiation. This hybrid approach offers a significant advantage for assessing complex acoustic sources, allowing for quick and reliable evaluation of noise mitigation solutions.
Design verification and quality control of automotive components require analyzing the source location of ultra-short sound events, for instance the engagement of an electromechanical clutch or the clicking of a passenger car seat's aluminium frame under vibration. State-of-the-art acoustic cameras allow a frame rate of about 100 acoustic images per second. Considering that most of the sound events introduced above last far less than 10 ms, an acoustic image generated at this rate resembles a hard-to-interpret overlay of multiple sources on the structure under test, along with reflections from the surrounding test environment. This contribution introduces a novel method for visualizing impulse-like sound emissions from automotive components at 10x the frame rate of traditional acoustic cameras. A time resolution of less than 1 ms allows for true localization of the initial and subsequent sound events as well as a clear separation of direct from …
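The frame-rate arithmetic behind this abstract is worth making explicit: at 100 images per second each acoustic image integrates 10 ms of sound, longer than the events of interest, while a 10x frame rate shrinks the window to 1 ms. A trivial sketch (the function name is ours, not from the paper):

```python
def frame_window_ms(frames_per_second):
    """Time span (ms) covered by one acoustic image at a given frame rate."""
    return 1000.0 / frames_per_second

# Conventional acoustic camera: one image smears a whole <10 ms click event.
print(frame_window_ms(100))       # 10.0 ms per image
# 10x the frame rate: window drops below the event duration,
# so initial and subsequent sub-events land in separate images.
print(frame_window_ms(10 * 100))  # 1.0 ms per image
```

This is why the 10x speed-up is the key enabler for separating a clutch's initial impact from its aftershocks and room reflections.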
The segment manipulator machine, a large custom-built apparatus, is used for assembling and disassembling heavy tooling, specifically carbon fiber forms. This complex yet slow-moving machine had been in service for nineteen years, with many control components becoming obsolete and difficult to replace. The customer engaged Electroimpact to upgrade the machine using the latest state-of-the-art controls, aiming to extend the system's operational life by at least another two decades. The program from the previous control system could not be reused, necessitating a complete overhaul.