Browse Topic: Augmented / virtual reality
ABSTRACT Operating a virtual vehicle to perform dynamic evaluation of a design can be achieved by combining augmented reality with a simulator. Many uses of virtual reality involve evaluating component packaging in a static, albeit interactive, manner: the virtual reality (VR) participant can interactively view the virtual environment and perform minor interactions such as toggling through alternative CAD models for comparison or changing the viewing position to another seat. The immersive 3D simulator system described in this paper enables the VR participant to perform operational tasks such as driving, gunnery, and surveillance. Furthermore, the system incorporates augmented reality to allow the virtual environment to be mixed with physical controls for operating the virtual vehicle.
Crew station design in the physical realm is complex and expensive due to the cost of fabrication and the time required to reconfigure hardware for human factors studies and optimization of space claim. However, recent advances in Virtual Reality (VR) and hand-tracking technologies have enabled a paradigm shift in the process. The Ground Vehicle System Center has developed an innovative approach using VR technologies to enable a trade-space exploration capability: crews can place touchscreens and switch panels as desired, then lock them into place to perform a fully recorded simulation of operating the vehicle through virtual terrain, maneuvering through firing points and engaging moving and static targets during virtual night and day missions with simulated infrared and night-vision sensor effects. Human factors are explored and studied using hand tracking, which enables operators to check reach by interacting with virtual components.
A research team at The University of Texas at Austin created a noninvasive electroencephalogram (EEG) sensor that was installed in a Meta VR headset that can be worn comfortably for long periods. The EEG measures the brain’s electrical activity during immersive VR interactions.
Researchers have developed SPINDLE, a pioneering robotic rehabilitation system. Combining virtual reality (VR) with customized resistance training, SPINDLE offers personalized therapy to enhance strength and dexterity for activities of daily living (ADLs). Its adaptability and potential for home use represent a major advancement in tremor rehabilitation, with broader healthcare implications.
Researchers worldwide are currently working on the next evolution of communication networks, called “beyond 5G” or 6G networks. To enable the near-instantaneous communication needed for applications like augmented reality or the remote control of surgical robots, ultra-high data speeds will be needed on wireless channels. In a study published recently in IEICE Electronics Express, researchers from Osaka University and IMRA AMERICA have found a way to increase these data speeds by reducing the noise in the system through lasers.
When designing an automotive seat, a detailed study of anthropometry is required, which deals with the measurement of human individuals and the understanding of human physical variation. It also requires an application-based movement study of the driver’s hands, feet, and overall body. It is very difficult to design seat curvatures using any static manikin-based software. At VECV, we have developed a new concept using mixed-reality VR technology to capture all body movements in order to design best-in-class seat curvature that accommodates a variety of drivers with different body types. We have designed a specialized static bunk with a wide range of seat, steering, and ABC pedal adjustments, which are integrated with virtual data. We use this bunk to study and capture driving-position data and other ergonomic postures from a wide range of people with different body types, according to their comfortable driving posture. In this comfortable driving posture, the user is immersed in…
A new washable wireless smart textile technology has potential uses in virtual reality and American Sign Language.
Engineers at the University of California San Diego have developed electronic “stickers” that measure the force exerted by one object upon another. The force stickers are wireless, run without batteries, and fit in tight spaces. That makes them versatile for a wide range of applications, from arming robots with a sense of touch to elevating the immersive experience of VR and AR, making biomedical devices smarter, monitoring the safety of industrial equipment, and improving the accuracy and efficiency of inventory management in warehouses.
Over the past few years, there has been a pressing need to implement automatic in-vehicle safety systems to avoid vehicle crashes and fatalities. Developing autonomous emergency braking systems (AEBS) to detect and avoid collisions in such critical moments is of paramount importance. In this paper, an AEBS is developed for a four-wheeler that aims to detect vehicles and control the ego vehicle based on the expected stopping distance (ESD). The control system reacts to the real-time relative distance and speed of the ego vehicle to actuate the appropriate braking force. Control systems developed in Altair Activate are co-simulated with CARLA, a virtual reality simulator for autonomous driving research. Various scenarios, including low- and high-speed car-to-car motion and urban high- and low-traffic-density environments, are simulated to study the robustness of the control system. Further studies are conducted to evaluate the effectiveness of the system by varying the…
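The abstract does not give the paper's control law, but the general ESD idea can be sketched in a few lines: estimate the distance needed to stop at the current speed, and brake in proportion to how far inside that envelope the lead vehicle is. All function names and the reaction-time/deceleration constants below are illustrative assumptions, not values from the paper.

```python
def expected_stopping_distance(speed_mps, reaction_time_s=1.0, decel_mps2=6.0):
    """Reaction distance plus braking distance at an assumed constant deceleration."""
    reaction_dist = speed_mps * reaction_time_s
    braking_dist = speed_mps ** 2 / (2.0 * decel_mps2)
    return reaction_dist + braking_dist

def aeb_brake_command(relative_distance_m, ego_speed_mps):
    """Return a normalized brake command in [0, 1] based on the ESD envelope."""
    esd = expected_stopping_distance(ego_speed_mps)
    if relative_distance_m >= esd:
        return 0.0  # gap exceeds the stopping envelope: no intervention
    # Scale braking with how far inside the stopping envelope the vehicle is.
    deficit = (esd - relative_distance_m) / esd
    return min(1.0, deficit)
```

In a co-simulation setup such as the one described, a block like this would receive the relative distance and ego speed from the simulator each step and feed the brake command back to the vehicle model.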
Directly debugging the control system of a vertical takeoff and landing (VTOL) fixed-wing aircraft can easily put equipment and personnel at risk, so it is effective to employ simulation and numerical methods to validate control performance. In this paper, an attitude stabilization controller for a VTOL fixed-wing aircraft is designed, and its performance is verified with MATLAB and visual simulation software, which significantly increases the design efficiency and safety of the controller. In detail, we first develop the aircraft’s six-degrees-of-freedom kinematics and dynamics models using Simulink modules, and a cascade PID control technique is applied to the aircraft’s attitude stabilization control. The visual simulation program then records the flight data and displays the flight course and condition, which can effectively validate the designed controller’s performance. It can be concluded that the designed VTOL fixed-wing aircraft control visual simulation…
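The cascade structure mentioned above is a standard pattern: an outer loop converts attitude error into a rate setpoint, and an inner loop tracks that rate with the actuators. A minimal sketch follows; the gains and function names are illustrative assumptions, not the paper's tuned values.

```python
class PID:
    """Textbook PID with simple rectangular integration."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Cascade: outer (attitude) loop produces a rate setpoint,
# inner (rate) loop turns the rate error into an actuator command.
angle_pid = PID(kp=4.0, ki=0.0, kd=0.0)   # outer loop: P-only is common here
rate_pid = PID(kp=0.8, ki=0.2, kd=0.02)   # inner loop: full PID

def attitude_step(angle_sp, angle, rate, dt):
    rate_sp = angle_pid.update(angle_sp - angle, dt)
    return rate_pid.update(rate_sp - rate, dt)
```

The inner loop runs on the fast rate measurement, so the cascade rejects disturbances before they accumulate into large attitude errors, which is why it is the usual choice for attitude stabilization.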
Scientists have developed a flexible battery as thin as a human cornea, which stores electricity when it is immersed in saline solution and could one day power smart contact lenses. Smart contact lenses are high-tech contact lenses capable of displaying visible information on our corneas and can be used to access augmented reality.
For almost as long as it’s been a concept, NASA has been on the cutting edge of virtual reality (VR) technology. However, the space has seen a renaissance since the bulky headsets of the 1990s. Several high-profile companies now use VR for immersive video games and virtual chat rooms, but, to some, this technology has a use beyond entertainment.
More pixels! This is a major trend in the display industry. The benefits of 8K or higher-resolution TVs may be debatable, but for eye-catching applications such as AR/VR glasses, more and therefore smaller pixels are required for technical feasibility. Screen-door effects and pixel inhomogeneities are easily visible and disturbing on displays that sit close to the viewer’s eye. μ-LEDs are considered an innovative technology for very high resolutions, with pixel sizes of less than 10 μm and equally small pixel pitches. In general, they have the potential to be a groundbreaking display technology, provided production challenges can be solved. Just like OLED displays, μ-LEDs are an emissive display technology, i.e., each single subpixel is itself a light source. Luminance and color variations between individual pixels are likely. As this strongly influences the visual quality of the displays, quality control and calibration of the displays are necessary not only in the…
A research team from the National University of Singapore (NUS) Faculty of Science, led by Professor Liu Xiaogang from the Department of Chemistry, has developed a 3D imaging sensor with an extremely high angular resolution: it can distinguish points of an object separated by an angular distance of as little as 0.0018°. The sensor operates on a unique angle-to-color conversion principle, allowing it to detect 3D light fields across the X-ray to visible-light spectrum.
Thanks to artificial intelligence (AI), augmented reality (AR) has long shaped product development across a variety of areas, including the medtech industry. Use of these trends can significantly improve diagnostics and, therefore, treatment. This applies, for example, to surgery and to the adjustment of medication regimens to reflect the patient’s needs. To do this, medical practitioners use recommendations provided by AI, which in turn draws on a broad digital database.
MIT researchers have built an augmented reality (AR) headset that gives the wearer X-ray vision. The headset combines computer vision and wireless perception to automatically locate a specific item that is hidden from view, perhaps inside a box or under a pile, and then guide the user to retrieve it.
Researchers from the University of Cambridge have developed an algorithm that gives an accurate measurement of tree diameter, an important metric used by scientists to monitor forest health and levels of carbon sequestration. The algorithm uses the low-cost, low-resolution LiDAR sensors incorporated into many mobile phones and provides results that are just as accurate as, but much faster than, manual measurement techniques.
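The Cambridge algorithm itself is not detailed in the snippet, but a common baseline for estimating trunk diameter from a horizontal slice of LiDAR returns is a least-squares (Kasa) circle fit. The sketch below illustrates that baseline; the function name and the assumption of a clean breast-height slice are illustrative.

```python
import math
import numpy as np

def fit_trunk_diameter(points):
    """Least-squares (Kasa) circle fit to a 2D trunk cross-section.

    points: iterable of (x, y) LiDAR returns sliced at breast height.
    Returns the fitted diameter in the same units as the input.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # From (x-a)^2 + (y-b)^2 = r^2, each point gives the linear equation
    # 2*a*x + 2*b*y + c = x^2 + y^2, where c = r^2 - a^2 - b^2.
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = math.sqrt(c + a ** 2 + b ** 2)
    return 2 * radius
```

Noisy, low-resolution phone LiDAR would in practice need outlier rejection (e.g. RANSAC around this fit) before the circle fit is trustworthy.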
Technology capable of replicating the sense of touch — also known as haptic feedback — can greatly enhance human-computer and human-robot interfaces for applications such as medical rehabilitation and virtual reality. A soft artificial skin was developed that provides haptic feedback and, using a self-sensing mechanism, has the potential to instantaneously adapt to a wearer’s movements.
Recent events have shown that challenges to the global status quo can arise rapidly, making it imperative that military manufacturers remain agile and prepared to meet new circumstances as they emerge. As author Jim Pattison succinctly stated: “No matter what business you are in, there is change, and it's happening pretty quickly.” The challenges posed to military manufacturers include shorter design and production timetables, the need for greater efficiency in parts replacement and material usage, and an accelerated time to market; challenges that must be met with every technological tool available. The current revolution in manufacturing, driven by digital technologies, is transforming the global production landscape. While artificial intelligence, augmented reality, and the Industrial Internet of Things (IIoT) are increasing production efficiency and reducing time to market, 3D scanning has still not been exploited by aerospace and defense manufacturers to its full potential to do the same.
Industrial vehicles such as forklifts, cranes, and tractors have come a long way in terms of applying technology, enhancing performance with improved operation and safety. With the advancements in computer vision, robotics, and artificial intelligence (AI), these vehicles are now equipped with functionality that utilizes information to support and optimize performance. One of the key enablers of these advances is the integration of machine-learning (ML) platforms with powerful computers, neural processing units, and cameras integrated into the digital system. System designers and OEMs can get started with AI and computer vision using just a standard Ethernet camera and a dual- or quad-core Cortex-A35 next-generation display. Using object detection and recognition, designers can implement and train neural networks to realize new solutions for process guidance, automation, augmented reality, and operator awareness. For example, the dual-core CCpilot V700 from CrossControl can…
The way in which businesses, enterprises, industry leaders, and consumers utilize technology for everyday tasks is set to undergo one of the most drastic evolutions ever. Just a few short years ago it was nearly impossible to think any other technology could have a greater impact than networked computers, the Internet, or even mobile computing, but now technologies like artificial intelligence (AI), the Internet of Things (IoT), and AR/VR are being hyped more than ever.
Despite recent supply-chain disruptions, semiconductor buyers are increasingly spoiled for choice when it comes to chip manufacturers. From a chipmaker’s perspective, competing for “sockets” in the next gadget, whether it be a mobile phone, a VR headset, or an EV, is a game of speed. As a result, competition among producers has continued to reduce product development and delivery times, from what was once years to now a few short months.
Current simulation tools can assist in expediting the development and validation of powertrain control systems compared to traditional methods that rely exclusively on physical testing. In this study, a co-simulation platform has been developed by connecting Matlab/Simulink, IPG CarMaker, GT-Suite, and PTV Vissim to create a virtual calibration and validation environment. The purpose of developing this platform is to save time and reduce costs during vehicle testing by largely replacing the calibration and robustness evaluation carried out on physical vehicles with a simulation-based approach. The platform constructs a virtual environment that replicates the local road network and populates the roads with a randomly generated traffic pattern that the simulated vehicle can interact with. By combining the strengths of each software package listed above, the developed virtual environment can provide realistic drive scenarios for testing new vehicle control algorithms. One of the…
Whether it's on top of a self-driving car or embedded inside the latest gadget, Light Detection and Ranging (LiDAR) systems will likely play an important role in enabling vehicles to see in real time, phones to map three-dimensional images, and video games to deliver enhanced augmented reality. The challenge is that these 3D imaging systems can be bulky, expensive, and hard to shrink down to the size needed for new applications.
Despite all the advances in consumer technology over the past decades, one component has remained frustratingly stagnant: the optical lens. Unlike electronic devices, which have gotten smaller and more efficient over the years, the design and underlying physics of today's optical lenses haven't changed much.
The race continues between the world’s largest tech leaders and companies to see which one will prevail and power the next generation of tools, technologies, and resources for manufacturing, healthcare, construction, and many other vertical market applications. These companies have been working tirelessly to create changes that will make a significant impact on our world. This all starts with the technological advances that have been made in recent years with artificial intelligence (AI) and immersive mixed-reality technologies such as augmented reality (AR) and virtual reality (VR). All these technologies have specific differences, but they’re also now working together in advanced three-dimensional (3D) applications and environments.
Aircraft manufacturing procedures are critical and require skilled engineers who must adhere to numerous processes and procedures in their daily work. The challenge is not only to identify the right tools and manuals but also to keep track of operator usage data and behavior. With augmented reality (AR), intelligent tools, and analytics, we can give first-line assembly engineers a new lease of life. The objective is to show how AR can help reduce rework and scrap under the Industry 4.0 concept [1], integrating cutting-edge technologies such as machine learning and the Internet of Things (IoT) to meet fundamental manufacturing requirements. Smart manufacturing consists of four major components: cyber-physical systems, IoT, cloud computing, and cognitive computing. Our objective is to integrate augmented reality with IoT sensors to achieve cyber-physical system connectivity and bridge the gap between the smallest physical assets and the digital infrastructure.
Through the use of magnetic fields, scientists have developed an electronic sensor that can simultaneously process both touchless and tactile stimuli. Prior attempts have so far failed to combine these functions on a single device due to overlapping signals from the various stimuli.
Consumers are looking for augmented reality/virtual reality (AR/VR) glasses that are compact and easy to wear, delivering high-quality imagery with socially acceptable optics that don't look like “bug eyes.” Researchers have imprinted freeform optics with a nanophotonic optical element called a metasurface.
Micro-optics and nanostructures are key technologies for the latest optoelectronic components in smartphones, smart glasses, and vehicles. Some examples used in consumer electronics include microlenses in time-of-flight or ambient light sensors, diffractive optical elements (DOE) for structured light generation, as well as surface relief gratings with nanometer precision in diffractive waveguides that enable new applications like 3D sensing and augmented reality glasses.