The Color Specification of Surrogate Roadside Objects for the Performance Evaluation of Roadway Departure Mitigation Systems
ISSN: 0148-7191, e-ISSN: 2688-3627
Published April 03, 2018 by SAE International in United States
Roadway departure mitigation systems, which help avoid and/or mitigate roadway departure collisions, have been introduced by several vehicle manufacturers in recent years. To support the development and performance evaluation of these systems, a set of commonly seen surrogate roadside objects needs to be developed. These objects include grass, curbs, metal guardrails, concrete dividers, and traffic barrels/cones. This paper describes how the representative colors of these roadside surrogates were determined. Google Street View images from 24,762 locations were selected for the color determination of roadside objects. To mitigate the effect of brightness on color determination, images not taken in good weather or bright daylight, or taken under shade, were manually eliminated. The RGB values of the roadside objects in the remaining images were then extracted. To obtain the representative color of each roadside object type, the K-means clustering algorithm was applied to find color clusters in the modified CIE LUV color space, and the Silhouette index was used to determine the optimal number of clusters. The clustered color covering the highest percentage of sampled locations was chosen as the representative color for each roadside object type.
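The pipeline described above (RGB extraction, conversion to a LUV color space, K-means clustering, Silhouette-based selection of the cluster count, and picking the most-populated cluster) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses the standard CIE L\*u\*v\* transform with a D65 white point rather than the paper's "modified" LUV space, and it assumes scikit-learn's `KMeans` and `silhouette_score` are available. The function names and the candidate range of cluster counts are illustrative choices.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score


def srgb_to_luv(rgb):
    """Convert sRGB values in [0, 255] to CIE L*u*v* (standard transform, D65 white).

    The paper uses a *modified* CIE LUV space; this standard version is an
    assumption used here for illustration.
    """
    rgb = np.asarray(rgb, dtype=float) / 255.0
    # Inverse sRGB companding (linearize the gamma-encoded values)
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear sRGB -> XYZ (sRGB/D65 matrix)
    M = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = lin @ M.T
    X, Y, Z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    denom = np.where(X + 15.0 * Y + 3.0 * Z == 0, 1e-12, X + 15.0 * Y + 3.0 * Z)
    u_p, v_p = 4.0 * X / denom, 9.0 * Y / denom
    un, vn = 0.19784, 0.46832          # D65 reference-white chromaticity
    eps = (6.0 / 29.0) ** 3
    L = np.where(Y > eps, 116.0 * np.cbrt(Y) - 16.0, (29.0 / 3.0) ** 3 * Y)
    return np.column_stack([L, 13.0 * L * (u_p - un), 13.0 * L * (v_p - vn)])


def representative_color(rgb_samples, k_range=range(2, 7), seed=0):
    """Cluster sampled colors in LUV, choose k by the Silhouette index, and
    return (center of the largest cluster in LUV, its share of samples, k)."""
    luv = srgb_to_luv(rgb_samples)
    best = None
    for k in k_range:
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(luv)
        score = silhouette_score(luv, km.labels_)
        if best is None or score > best[0]:
            best = (score, km)
    km = best[1]
    counts = np.bincount(km.labels_)
    top = int(np.argmax(counts))
    return km.cluster_centers_[top], counts[top] / len(luv), km.n_clusters
```

For example, feeding the function a mix of greenish "grass" pixel samples and orange "barrel" samples should recover two clusters and report the grass cluster (the majority) as representative. Clustering in a perceptually motivated space such as LUV, rather than raw RGB, makes Euclidean distances between samples correspond more closely to perceived color differences, which is why the paper works there.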
Citation: Yi, Q., Shen, D., Lin, J., Chien, S. et al., "The Color Specification of Surrogate Roadside Objects for the Performance Evaluation of Roadway Departure Mitigation Systems," SAE Technical Paper 2018-01-0506, 2018, https://doi.org/10.4271/2018-01-0506.