Browse Topic: Big data
ABSTRACT An increasing pace of technology advancements and recent heavy investment by potential adversaries have eroded the Army's overmatch and spurred significant changes to the modernization enterprise. Commercial ground vehicle industry solutions are not directly applicable to Army acquisitions because of differences in volume, usage, and life cycle requirements. To meet increasingly aggressive schedule goals while ensuring high-quality materiel, the Army acquisition and test and evaluation communities need to retain flexibility and continue to pursue novel analytic methods. Fully utilizing test and field data and incorporating advanced techniques such as big data analytics and machine learning can lead to smarter, more rapid acquisition and a better overall product for the Soldier. In particular, logistics data collected during operationally relevant events, originally intended for the development of condition based maintenance procedures, has been shown to provide
ABSTRACT Camber Corporation, under contract with the TACOM Life Cycle Management Command Integrated Logistics Support Center, has developed an innovative process of data mining and analysis to extract information from Army logistics databases, identify top cost and demand drivers, understand trends, and isolate environmental issues. These analysis techniques were initially used to assess TACOM-managed equipment in extended operations in Southwest Asia (SWA). In 2009, at the request of TACOM and the Tank Automotive Research, Development and Engineering Center (TARDEC), these data mining processes were applied to four tactical vehicle platforms in support of Condition Based Maintenance (CBM) initiatives. This paper describes an enhanced data mining and analysis methodology used to identify and rank components as candidates for CBM sensors, assess total cost of repair/replacement and determine potential return on investment in applying CBM technology. Also discussed in this paper is the
The auto industry has relied on traditional testing methodologies for product development and quality testing since its inception. As technology changed, customer demand shifted toward better vehicles with the highest quality standards. With the advent of EVs, OEMs are looking to reduce the time to market for their products to win the EV race. Traditional testing methodologies have relied on data received from various stakeholders, and tests are planned based on that data. The data used is highly subjective and lacks variety. OEMs across the world are betting big on telematics solutions, pushing more and more vehicles with telematics devices as standard fitment. The data such vehicles generate, with high volume, variety, and velocity, can aid a new age of vehicle testing. This live data cannot simply be simulated in test environments. The device generates hundreds of signals, frequently within fractions of a second. Multiple such signals can
Using current technologies, a single “entry level” vehicle has millions of electrical signals sent through dozens of modules, sensors, and actuators, and those signals can be sent over the air, creating telemetry data that can be used for several purposes. Each electrical device is set up with diagnostics to make maintenance feasible, support repair, and give specialists direction for improvement on new developments and specifications; in several cases, however, the diagnostics can only determine the mechanism of a failure, not the event that triggered it. The current evaluation method involves teardown, testing, and the knowledge of the specialized team involved, but it requires recovering failed parts, which, for larger automakers with thousands of dealers/repair shops, reduces the sample for analysis when there is a systemic issue with one component. This situation is usual in propulsion systems, regarding electro-mechanical devices and sensors, also in
The automotive industry is going through one of its greatest restructurings: the migration from internal combustion engines to electric-powered, internet-connected vehicles. Adapting to a new consumer who is increasingly demanding and selective may be one of the greatest challenges of this generation, and Original Equipment Manufacturers (OEMs) have been struggling to keep offering a diversified variety of features to their customers while also maintaining their quality standards. The vehicles leave the factory with an embedded SIM card and a telematics module, an electronic unit that enables communication between the car and the data center. Connected vehicles generate tens of gigabytes of data per hour that have the potential to be transformed into valuable information for companies, especially regarding the behavior and desires of drivers. One of the techniques used to gather quality feedback from customers is the NPS (Net Promoter Score), which consists of open questions focused on top-of-mind feedback. Here
The main purpose of this research is to identify how the established quality methodologies known worldwide as TQC (Total Quality Control) and TQM (Total Quality Management) are supported by the tools of the Quality 4.0 concept, which was similarly influenced by the disruptive technologies of Industry 4.0 in the last decade. To crosscheck the relationship between TQC and TQM and how Quality 4.0 supports these quality systems, a qualitative investigation method was adopted through a survey questionnaire applied to one of the most important automobile companies worldwide, also based in Brazil: Toyota of Brazil. Based on a literature review and the relationships and synergies among these concepts, it was possible to analyze and draw the conclusions of this research work. The main results show that TQC and TQM are very well established quality concepts and that Quality 4.0 concepts and tools have been implemented along a path that follows the prioritization of market importance, so then
Advancements in connected vehicle technology in recent years have propelled the use of concepts like the Internet of Things (IoT) and big data in the automotive industry. The progressive electrification of the powertrain has led to the integration of various sensors in the vehicle. The data generated by these sensors is continuously streamed through a telematics device on the vehicle, and analytics on this data can enable a variety of applications. Predictive maintenance is one such area, where machine learning algorithms are applied to relevant data to predict failure. Field vehicle malfunction or breakdown is costly for manufacturers' aftermarket services, and in the case of commercial vehicles, downtime is the biggest concern for the customer. Predictive maintenance techniques can prevent many critical failures by addressing the root cause in the early stages of failure. Engine overheating is one such problem that occurs in diesel engines. Overheating of an engine
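As a rough illustration of the predictive-maintenance idea this abstract describes, the sketch below trains a classifier to flag trips at risk of engine overheating. The telematics features, thresholds, and synthetic labels are assumptions for illustration, not the paper's actual data or model.

```python
# Minimal predictive-maintenance sketch: classify trips at risk of
# engine overheating from telematics features. Feature names and the
# failure label are hypothetical placeholders, not from the paper.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 5000
# Hypothetical per-trip aggregates streamed from the telematics device.
X = np.column_stack([
    rng.normal(90, 8, n),     # mean coolant temperature (deg C)
    rng.normal(60, 20, n),    # mean engine load (%)
    rng.normal(1800, 400, n), # mean engine speed (rpm)
])
# Synthetic label: hot coolant under high load overheats more often.
y = ((X[:, 0] > 98) & (X[:, 1] > 70)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```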
The global big data market had revenue of $162.6 billion in 2021 [1]. Data is becoming more valuable to companies than gold. However, this data has historically been used without contributors' informed consent and without them seeing a penny from the discoveries the data led to. This article discusses how non-fungible tokens (NFTs) can provide a helpful tool for pharmaceutical companies to track contributed data and compensate contributors accordingly. NFTs are unique, non-interchangeable cryptographic assets that can be tracked on a blockchain. NFTs provide a unique, traceable token that cannot be replicated, providing a perfect tool to store biodata. The term biodata refers to details regarding a patient's history and behavioral patterns
Traditional methods of municipal domestic waste analysis and prediction lack precision, while the sample sizes of most datasets are too small for many neural networks. In this paper, combining the advantages of deep learning methods with the results of association analysis, a waste production prediction method, TLSTM, is proposed based on long short-term memory (LSTM). It is found that the most influential factors are population, public cost, households, and GDP. Meanwhile, garbage production in Shanghai will continue to decline in the future, indicating that the refuse classification policy is effective. The R-square and MSE indices of the model were 0.55 and 76571.73 respectively, surpassing other state-of-the-art models. In cooperation with the School of Environmental Science and Engineering at Shanghai Jiao Tong University, the dataset comes from the average data of the Shanghai Household Waste Management Regulation from 1980 to 2020. This research method has a certain guiding significance to
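The TLSTM details are not reproduced in this abstract; the sketch below shows a generic Keras LSTM regressor on a short annual series of the kind described, with the window length, layer sizes, and data all assumed for illustration.

```python
# Sketch of an LSTM regressor for annual waste-production data, in the
# spirit of the TLSTM model above. The window length, layer size, and
# synthetic series are assumptions, not the paper's configuration.
import numpy as np
import tensorflow as tf

# Synthetic stand-in for 41 years (1980-2020) of waste production.
series = np.linspace(600, 900, 41) + np.random.default_rng(1).normal(0, 20, 41)
window = 5  # use 5 past years to predict the next one

X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]  # shape (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(16, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=200, verbose=0)
print(model.predict(X[-1:]))  # forecast the next year's production
```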
Kontron and Intel experts explain how rugged, modular COM Express solutions reduce complexity and allow retrofit of autonomous systems on heavy mobile equipment. Continually transformed with more than a century's advances in capabilities, hydraulics and fuel efficiency, today's heavy mobile equipment must also become more intelligent and better connected. Technologies such as artificial intelligence (AI), deep learning, big data, GPS, 5G and computer vision are proving their mettle - empowering far more efficient ways of carrying out unique and demanding tasks via advanced telematics, advanced driver assistance systems (ADAS) or varying levels of autonomy. Heavy mobile equipment (HME) that can gather and apply data in real time operates and makes decisions in ways that humans cannot. This evolution toward automation promises not only leadership for manufacturers of more advanced systems, but also increased safety, economy, efficiency and ecological compatibility
In this study, big data analysis and a user survey of driving records were conducted to investigate the frequency of use and ease of operation of the regen paddle that controls the one-pedal driving system in an electric vehicle. Analysis of big data from 3.8 million driving records found that drivers manipulate the paddle 3.31 times on average during a single trip, mainly during the early stages of driving. User observation research found that 41.8% of participants did not use the regen paddle, or used it fewer than 5 times, during a single trip. Also, 336 participants (83%) responded that regen paddle manipulation for one-pedal driving was inconvenient. In conclusion, because the frequency of use of the regen paddle is low and its operation is inconvenient, it seems necessary to change the design of the regen paddle
After so many supply chain and logistics challenges since the start of the COVID-19 pandemic, we’ve seen numerous headlines proclaiming a new era of reshoring and the end of lean production and just-in-time manufacturing strategies
In order to improve the energy efficiency of hybrid electric vehicles and the effectiveness of energy management algorithms, it is very important to predict future changes in traffic parameters from traffic big data, so as to predict future vehicle speed changes and reduce friction braking. Under the framework of deep learning, this paper establishes a Long Short-Term Memory (LSTM) artificial neural network traffic flow parameter prediction model based on time series, built with the Keras library, to predict the future state of traffic flow. A comparison experiment between the LSTM model and a Gated Recurrent Unit (GRU) model on the US-101 dataset shows that the LSTM has higher accuracy in predicting traffic flow velocity
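A minimal sketch of the LSTM-versus-GRU comparison the abstract describes, using the Keras API it mentions. The speed series here is synthetic; the study itself uses the US-101 dataset.

```python
# Compare LSTM and GRU speed predictors on a windowed time series.
# Synthetic data; hyperparameters are illustrative assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(2)
t = np.arange(2000)
speed = 25 + 5 * np.sin(t / 50) + rng.normal(0, 0.5, t.size)  # m/s

window = 20
X = np.array([speed[i:i + window] for i in range(len(speed) - window)])[..., None]
y = speed[window:]
split = int(0.8 * len(X))

def build(cell):
    m = tf.keras.Sequential([cell(32, input_shape=(window, 1)),
                             tf.keras.layers.Dense(1)])
    m.compile(optimizer="adam", loss="mse")
    return m

for name, cell in [("LSTM", tf.keras.layers.LSTM), ("GRU", tf.keras.layers.GRU)]:
    m = build(cell)
    m.fit(X[:split], y[:split], epochs=5, verbose=0)
    print(name, "test MSE:", m.evaluate(X[split:], y[split:], verbose=0))
```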
Small Form Factor, Modular Data Centers at the Edge of the Battlefield In order to achieve and maintain warfighting overmatch, coordinate deployed forces, and enable new capabilities, the US Army, Air Force, and Navy are actively looking to new programs such as Joint All Domain Command and Control (JADC2) to ensure warfighters have maximum situational awareness. These programs will deliver a variety of compute- and bandwidth-intensive technologies, increasing the use of big data analytics, artificial intelligence/machine learning, and video, for example, using common technical standards, APIs, and data formats to deliver the command and control information that warfighters need to coordinate their activities. The software needed to run these new capabilities is increasingly being developed to rely on the cloud, which itself might reside in a variety of data centers, ranging from large commercial services, such as Amazon Web Services (AWS) GovCloud and Microsoft Azure Government, to the
The U.S. Food and Drug Administration's (FDA) multifaceted responsibilities require continuous monitoring of trends in science and technology for the advancement of public health. In early 2020, the agency saw investigational new drug (IND) applications skyrocket to 3,806, a significant increase compared to the previous year, when it received only 166 applications during the same months [1]
With the rapid development of emerging technologies such as cloud computing, big data, the Internet of Things, artificial intelligence, and fifth-generation (5G) mobile network technology, the construction of intelligent expressway systems has become a trend. Through analysis of the intelligent expressway business and functions, the overall architecture, technical architecture, and data architecture of this system are designed and presented in this paper. The overall architecture describes the basis of intelligent expressway design and implementation. The technical architecture is the foundation of the overall architecture, giving it more complete functions, high security, and high feasibility. The data architecture helps to achieve effective data coordination through data standard management, resource monitoring, catalog management, metadata management, quality management, and security management. Accelerating the deep integration of modern technologies and expressway
The significance of CAE simulation is thus increasing because of its ability to predict failure faster; many design combinations can also be evaluated with it before physical testing. Frame stiffness of side doors is one of the major criteria of a vehicle closure system. In most cases, designers around the globe recurrently design the same or very similar side door frame structures. In addition, given the current growing trend, producing an optimized side door frame design quickly is very challenging. In this investigation, a new artificial intelligence (AI) approach was demonstrated to design and optimize frame reinforcement: machine learning, which has been successful in many fields owing to its ability to process big data, can be used in structural design and optimization. This deep learning-based model is able to achieve accurate predictions of nonlinear structure-parameter relationships using deep neural networks. The optimized designs with optimization
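The surrogate-modeling workflow implied here can be sketched as follows: fit a neural network to (parameter, stiffness) pairs that would normally come from CAE runs, then search the parameter space cheaply. The design variables and response function below are illustrative assumptions, not the paper's data.

```python
# Surrogate-model sketch: learn a parameter-to-stiffness map, then
# optimize over it. The response function stands in for CAE results.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
# Hypothetical design variables: gauge (mm), flange width (mm), bead depth (mm)
params = rng.uniform([0.6, 20, 2], [1.4, 60, 8], size=(500, 3))
# Stand-in for CAE-computed stiffness (nonlinear in the parameters).
stiffness = 100 * params[:, 0] ** 1.5 + 0.8 * params[:, 1] + 5 * np.log(params[:, 2])

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                         random_state=0).fit(params, stiffness)

# Cheap optimization: evaluate the surrogate on many random candidates.
candidates = rng.uniform([0.6, 20, 2], [1.4, 60, 8], size=(100000, 3))
best = candidates[surrogate.predict(candidates).argmax()]
print("best candidate (gauge, flange, bead):", best)
```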
Historical driver behavior and drive style are crucial inputs in addition to V2X connectivity data to predict future events as well as fuel consumption of the vehicle on a trip. A trip is a combination of different maneuvers a driver executes to navigate a route and interact with his/her environment including traffic, geography, topography, and weather. This study leverages big data analytics on real-world customer driving data to develop analytical modeling methodologies and algorithms to extract maneuver-based driving characteristics and generate a corresponding maneuver distribution. The distributions are further segmented by additional categories such as customer group and type of vehicle. These maneuver distributions are used to build an aggressivity distribution database which will serve as the parameter basis for further analysis with traffic simulation models. The database will also be leveraged to investigate and predict the performance of the vehicle on a trip and driver
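A toy version of the maneuver extraction step this abstract describes might look like the following: segment a 1 Hz speed trace into accelerate/brake/cruise maneuvers and compute a maneuver distribution. The thresholds and trace are assumptions, not the study's definitions.

```python
# Segment a speed trace into maneuvers and build a maneuver
# distribution. Threshold and synthetic trip are illustrative.
import numpy as np
from collections import Counter

rng = np.random.default_rng(4)
speed = np.cumsum(rng.normal(0, 0.3, 3600)).clip(0, 30)  # 1 Hz trip, m/s
accel = np.gradient(speed)  # m/s^2 at 1 Hz sampling

def label(a, thr=0.15):
    return "accelerate" if a > thr else "brake" if a < -thr else "cruise"

labels = [label(a) for a in accel]
# Collapse consecutive identical labels into discrete maneuvers.
maneuvers = [labels[0]] + [b for a, b in zip(labels, labels[1:]) if a != b]
dist = Counter(maneuvers)
total = sum(dist.values())
print({m: round(c / total, 3) for m, c in dist.items()})
```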
Manufacturing facilities generate massive amounts of operational data from their automated production equipment, condition monitoring devices, and other sensors and systems. As companies become more aware of the potential marooned in these assets, they are asking how Industrial Internet of Things (IIoT) initiatives can help tap into this information and create useful insights. But many attempts to tackle this via enterprise-wide mega-projects fail to meet expectations because of their sheer size and complexity. Perhaps a better approach is to begin with “little” data at the source and build up to big data using edge computing, focused applications, and open connectivity
Car-share trajectories are spatiotemporal big data that contain the travel behavior of residents. Mining residents' travel hotspots from car-share trajectory data is of great significance for station planning. This paper uses a clustering algorithm based on grid density. The algorithm first divides the trajectory space into grid cells and sets a density threshold for the cells; it then maps the trajectory points to the grid cells and extracts hot grid cells based on the density threshold; by merging reachable hotspot grid cells, urban hotspot areas are discovered. This paper analyzes residents' travel demand in the hotspot areas and uses a random forest model to predict that demand, which can serve as a reference for car-share companies deploying cars and provide convenience for users
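The grid-density steps outlined in this abstract can be sketched directly: bin points into grid cells, keep cells above a density threshold, and merge adjacent hot cells into hotspot areas. The grid resolution, threshold, and points below are assumed values.

```python
# Grid-density hotspot sketch: histogram points into cells, threshold
# by density, then merge reachable hot cells via connected labeling.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
# Synthetic pickup/drop-off points around two hotspots plus noise.
pts = np.vstack([rng.normal([2, 2], 0.3, (400, 2)),
                 rng.normal([7, 6], 0.4, (300, 2)),
                 rng.uniform(0, 10, (300, 2))])

counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1],
                              bins=20, range=[[0, 10], [0, 10]])
hot = counts >= 10  # density threshold per grid cell

# Merge reachable (adjacent) hot cells into labeled hotspot areas.
areas, n_areas = ndimage.label(hot)
print(f"{n_areas} hotspot areas found")
```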