Browse Topic: Big data
We present a novel processing framework for extracting ship traffic flow, designed to cope with the large volume, high noise levels, and complex spatio-temporal nature of AIS data. We preprocess AIS data using covariance matrix-based abnormal data filtering, develop an improved Douglas-Peucker (DP) algorithm for multi-granularity trajectory compression, identify navigation hotspots and intersections using density-based spatial clustering, and visualize chart overlays using the Mercator projection. In experiments with AIS data from the Laotieshan waters in Bohai Bay, we achieve a compression rate of up to 97% while keeping the retention error of key trajectory features below 0.15 nautical miles. We identify critical areas such as waterway intersections and generate traffic flow heatmaps for maritime management, route planning, and related applications.
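The abstract does not spell out the improved multi-granularity DP variant, but the classic Douglas-Peucker compression it builds on can be sketched in a few lines (function names are illustrative; a planar distance is assumed, which is a reasonable approximation over small AIS extents):

```python
def _perp_dist(p, a, b):
    # Perpendicular distance from point p to the line through a and b
    # (planar approximation; adequate for short trajectory chords).
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, epsilon):
    """Compress a trajectory, keeping only points that deviate from the
    endpoint chord by more than epsilon."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord between the two endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= epsilon:
        # Everything in between is within tolerance: drop it.
        return [points[0], points[-1]]
    # Recurse on both halves; drop the duplicated split point when joining.
    left = douglas_peucker(points[: idx + 1], epsilon)
    right = douglas_peucker(points[idx:], epsilon)
    return left[:-1] + right
```

Running the routine with several epsilon values yields the multi-granularity behavior the paper describes; the compression rate is then 1 − (kept points / original points).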
In-use emission compliance regulations globally mandate that machines meet emission standards in the field, beyond dyno certification. For engine manufacturers, understanding emission compliance risks early is crucial for technology selection, calibration strategies, and validation routines. This study focuses on developing analytical and statistical methods for emission compliance risk assessment using Fleet Intelligence Data, which includes high-frequency telematics data from over 500K machines, each reporting more than 1000 measures at 1 Hz. Traditional analytical methods are inadequate for handling such big data, necessitating advanced methods. We developed data pipelines to query measures from the Enterprise Data Lake (a structured data storage system), address big data challenges, and ensure data quality. Regulatory requirements were translated into software logic and applied to the pre-processed data for emission compliance assessment. The resulting reports provide actionable
In view of the structural complexity of railway engineering, the systematic nature of professional collaboration, and the high reliability required for operational safety, this paper studies a spatial-temporal information data organization model covering all elements across the whole domain of the Shuozhou-Huanghua Railway, from the perspective of Shuozhou-Huanghua Railway spatial-temporal information security. Taking a unified spatial-temporal benchmark as the main line, the paper associates different spatial-temporal information to form an efficient, all-element, whole-domain organization model of Shuozhou-Huanghua Railway spatial-temporal information, implementing the effective organization of massive spatial-temporal information across the railway's various specialties and fields. By using GIS (Geographic Information System) visualization technology, spatial analysis technology, and big data real-time dynamic rendering technology, real-time dynamic visualization display was realized
The auto industry has relied upon traditional testing methodologies for product development and quality testing since its inception. As technology changed, customer demand shifted toward better vehicles with the highest quality standards. With the advent of EVs, OEMs are looking to reduce time to market for their products to win the EV race. Traditional testing methodologies have relied upon data received from various stakeholders, with tests planned on that basis. This data is highly subjective and lacks variety. OEMs across the world are betting big on telematics solutions by shipping more and more vehicles with telematics devices as standard fitment. The data such vehicles generate, with high volume, variety, and velocity, can aid a new age of vehicle testing. This live data cannot simply be simulated in test environments. The device generates hundreds of signals, often within a fraction of a second. Multiple such signals can
Using current technologies, a single “entry level” vehicle carries millions of electrical signals sent through dozens of modules, sensors, and actuators, and those signals can be sent over the air, creating telemetry data that can be used for several ends. An electrical device is set up with diagnostics to make maintenance feasible, support repair, and give specialists improvement directions for new developments and specifications; in several cases, however, the diagnostics can only determine the mechanism of failure, not the event that triggered it. The current evaluation method involves teardown, testing, and the knowledge of the specialized team involved, but this requires recovering failed parts, which, in larger automakers with thousands of dealers and repair shops, shrinks the sample available for analysis when there is a systemic issue with one component. This situation is common in propulsion systems, regarding electro-mechanical devices and sensors, also in
The automotive industry is going through one of its greatest restructurings: the migration from internal combustion engines to electric-powered, internet-connected vehicles. Adapting to a consumer who is increasingly demanding and selective may be one of the greatest challenges of this generation; Original Equipment Manufacturers (OEMs) have been struggling to keep offering a diversified variety of features to their customers while maintaining their quality standards. Vehicles leave the factory with an embedded SIM card and a telematics module, an electronic unit that enables communication between the car and the data center. Connected vehicles generate tens of gigabytes of data per hour that can be transformed into valuable information for companies, especially regarding the behavior and desires of drivers. One technique used to gather quality feedback from customers is the NPS (Net Promoter Score), which consists of open questions focused on top-of-mind feedback. Here
The main purpose of this research is to identify how established quality methodologies, known worldwide as TQC (Total Quality Control) and TQM (Total Quality Management), are supported by the tools of the Quality 4.0 concept, which was similarly influenced by the disruptive technologies of Industry 4.0 over the last decade. To cross-check the relationship between TQC and TQM and how Quality 4.0 supports these quality systems, a qualitative investigation method was adopted through a survey questionnaire applied to one of the most important worldwide automobile companies, also based in Brazil: Toyota of Brazil. Based on a literature review and on the relationships and synergy among these concepts, it was possible to analyse and draw conclusions for this research work. The main results indicate that TQC and TQM are very well established quality concepts, and that Quality 4.0 concepts and tools have been implemented along a path prioritized by market importance, so then
Advancements in connected-vehicle technology in recent years have propelled the use of concepts like the Internet of Things (IoT) and big data in the automotive industry. The progressive electrification of the powertrain has led to the integration of various sensors in the vehicle. The data generated by these sensors is continuously streamed through a telematics device on the vehicle, and analytics on this data can enable a variety of applications. Predictive maintenance is one such area, where machine learning algorithms are applied to relevant data to predict failure. Field vehicle malfunction or breakdown is costly for manufacturers' aftermarket services, and in the case of commercial vehicles, downtime is the biggest concern for the customer. Predictive maintenance techniques can prevent many critical failures by tending to the root cause in the early stages of failure. Engine overheating is one such problem that occurs in diesel engines. Overheating of an engine
The global big data market had revenue of $162.6 billion in 2021 [1]. Data is becoming more valuable to companies than gold. Historically, however, this data has been used without contributors' informed consent and without them seeing a penny from the discoveries the data led to. This article discusses how non-fungible tokens (NFTs) can provide a helpful tool for pharmaceutical companies to track contributed data and compensate contributors accordingly. NFTs are unique, non-interchangeable cryptographic assets that can be tracked on a blockchain. An NFT provides a unique, traceable token that cannot be replicated, making it a well-suited tool for storing biodata. The term biodata refers to details regarding a patient's history and behavioral patterns.
Traditional methods of municipal domestic waste analysis and prediction lack precision, while the sample sizes of most available datasets are too small for many neural networks. In this paper, combining the advantages of deep learning methods with the results of association analysis, a waste production prediction method, TLSTM, is proposed based on long short-term memory (LSTM). The most influential factors are found to be population, public cost, household count, and GDP. Meanwhile, garbage production in Shanghai will continue to decline in the future, indicating that the refuse classification policy is effective. The R-square and MSE indices of the model were 0.55 and 76571.73 respectively, surpassing other state-of-the-art models. In cooperation with the School of Environmental Science and Engineering at Shanghai Jiao Tong University, the dataset was drawn from the average data of the Shanghai Household Waste Management Regulation from 1980 to 2020. This research method has a certain guiding significance to
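The two evaluation metrics this abstract reports, R-square and MSE, are standard and easy to state precisely. A minimal sketch of both (generic helper functions, not the paper's code; the model's actual predictions are not reproduced here):

```python
def mse(y_true, y_pred):
    # Mean squared error between observed and predicted values.
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def r_squared(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    # 1.0 is a perfect fit; values can go negative for poor models.
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Note that a large raw MSE (such as the 76571.73 reported) is only meaningful relative to the scale of the target variable, which is why R-square is reported alongside it.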
Kontron and Intel experts explain how rugged, modular COM Express solutions reduce complexity and allow retrofit of autonomous systems on heavy mobile equipment. Continually transformed with more than a century's advances in capabilities, hydraulics and fuel efficiency, today's heavy mobile equipment must also become more intelligent and better connected. Technologies such as artificial intelligence (AI), deep learning, big data, GPS, 5G and computer vision are proving their mettle - empowering far more efficient ways of carrying out unique and demanding tasks via advanced telematics, advanced driver assistance systems (ADAS) or varying levels of autonomy. Heavy mobile equipment (HME) that can gather and apply data in real time operates and makes decisions in ways that humans cannot. This evolution toward automation promises not only leadership for manufacturers of more advanced systems, but also increased safety, economy, efficiency and ecological compatibility.
This study presents big data analysis and a user survey of driving records, conducted to investigate the frequency of use and ease of operation of the regen paddle that controls the one-pedal driving system in an electric vehicle. Analysis of 3.8 million driving records showed that drivers manipulate the paddle 3.31 times on average during a single trip, mainly in the early stages of driving. In the user observation research, 41.8% of participants did not use the regen paddle, or used it fewer than 5 times, during a single trip. Moreover, 336 participants (83%) responded that regen paddle manipulation for one-pedal driving was inconvenient. In conclusion, because the use frequency of the regen paddle is low and its operation is inconvenient, a redesign of the regen paddle appears necessary.
After so many supply chain and logistics challenges since the start of the COVID-19 pandemic, we’ve seen numerous headlines proclaiming a new era of reshoring and the end of lean production and just-in-time manufacturing strategies.
To improve the energy efficiency of hybrid electric vehicles and the effectiveness of energy management algorithms, it is very important to predict future changes in traffic parameters from traffic big data, so as to anticipate future vehicle speed changes and reduce friction braking. Under a deep learning framework, this paper establishes a Long Short-Term Memory (LSTM) artificial neural network traffic-flow parameter prediction model based on time series, built with the Keras library, to predict the future state of traffic flow. A comparison experiment between the LSTM model and a Gated Recurrent Unit (GRU) model on the US-101 dataset shows that LSTM has higher accuracy in predicting traffic-flow velocity.
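Time-series models like the LSTM and GRU compared here are trained on sliding windows of past observations paired with the next value. The abstract does not give the windowing details, but the standard supervised framing can be sketched as follows (the function name and lookback length are illustrative):

```python
def make_windows(series, lookback):
    """Turn a 1-D speed series into (input window, next value) pairs,
    the supervised form an LSTM/GRU forecaster is trained on."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i : i + lookback])   # last `lookback` observations
        y.append(series[i + lookback])       # value to predict
    return X, y
```

In a Keras workflow, `X` would then be reshaped to `(samples, timesteps, features)` before being fed to the recurrent layer.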
1 – 50 of 168