Abstract

Research and technology in autonomous vehicles are becoming well recognized among computer scientists and engineers. Autonomous vehicles combine global positioning system (GPS), light detection and ranging (LIDAR), camera, radio detection and ranging (RADAR), and, less frequently, ultrasonic sensors. These vehicles use no fewer than two sensing modalities and usually three or more. The goal of this research is to determine which sensors to use depending on the functionality of the autonomous vehicle and to analyze the similarities and differences of sensor configurations, including configurations drawn from different industries. This study summarizes sensors in four industries: personal vehicles, public transportation, smart farming, and logistics. In addition, the paper discusses the advantages and disadvantages of each sensor configuration in light of the task the autonomous vehicle must accomplish. A table of results organizes the market availability of most of the sensors along with their advantages and disadvantages. After comparing the sensor configurations, recommendations are proposed for scenarios in which some types of sensors are more useful than others.

1 Introduction

Autonomous vehicles are attracting growing attention in the technology industry. More companies are designing self-driving cars and other vehicles and products that facilitate certain tasks for humans or even remove the need for human involvement altogether. These vehicles are being used for personal mobility, public transportation, process automation, and production in the farming industry.

Technology companies engaged in automated vehicle projects work with specific types of sensors, which allow these vehicles to perceive the environment and make the automation process possible. Each company strives to achieve its desired outcome for an autonomous vehicle by selecting the types of sensors and systems it needs.

Levels of automation range from simple driver assistance, such as keeping the vehicle in its lane, to autopilot systems like those popularized by Tesla. Several types of automated technology are being used across products that rely on software assistance to increase productivity. Some companies are constantly innovating to create better autonomous vehicles that could, at some point, be fully autonomous without any aid from a driver. To obtain the desired results, autonomous devices and vehicles need to contain specific systems and sensors adapted to the requirements of their industry.

In this paper, we review and compare the different sensors used in this type of technology and match them to the specific companies that use such devices in their autonomous projects. We create a report that makes it easy to visualize which types of sensors are used in which settings, focusing on the reasons why those sensors are better suited to that particular job. In addition, when a device presents limitations, we explain the disadvantages attributed to it. The study is organized around five topics: Types of Sensors, Levels of Autonomy, Personal Vehicles, Public Transportation, and Smart Farming, followed by Logistics.

We first discuss personal vehicles, a very attractive area for companies to develop their projects; innovation is constantly taking place here with the aim of delivering a fully autonomous vehicle in the near future. Public transportation is also experiencing a significant impact from autonomous technology, and it is important to analyze how companies are adapting their systems to this area. We then review the farming industry, where many vehicles and devices now use automation, including unmanned aerial vehicles (UAVs) used for flight and field monitoring, to boost production and minimize human intervention. Finally, we study the logistics industry, where autonomous trucks are beginning to be deployed.

2 Related Work

As artificial intelligence became more present in industry, developers and researchers started thinking about automation for vehicles. Building such projects required an extensive process of researching sensors and determining which types perform better in different situations. Sensors are in charge of most of the processes for automation in a vehicle, and in order to perform successfully, there must be a link, or what researchers call a fusion, between the different sensors to create a fully automated automobile.

2.1 Sensor Analysis for Safety Basis.

The first condition for deploying autonomous cars is to guarantee the safety of passengers, pedestrians, and the environment itself. There are several studies on sensor performance involving safety protocols, covering object detection, weather condition scenarios, lane support systems, and assistance systems that provide safety to the driver. The types of sensors used include Radio Detection and Ranging (RADAR), Light Detection and Ranging (LIDAR), ultrasonic sensors, Global Positioning System (GPS), cameras, and vehicle communication. The results of such research summarize the advantages and disadvantages of these sensors and systems and conclude that an optimal self-driving vehicle must fuse those sensors [1].

The approach proposed here focuses on the overall performance of these sensors and presents their advantages and disadvantages. It emphasizes comparison between different types of sensors and narrows the broad category of autonomous vehicles to subcategories, or industries, that operate this technology, such as personal and public transportation and farming. Some of these industries have developed their technologies only recently, which motivates revisiting earlier studies that can now be considered outdated.

2.2 Sensor Technology.

Moreover, there is extensive research about how the different types of sensors work and how they are used together to make autonomous vehicles perform. It is necessary to combine the data of multiple sensors because no single sensor is capable of doing all the work on its own: some specialize in providing visual information, while others measure the distance between the vehicle and a pedestrian. Studies have described this so-called fusion of the data provided by the various sensors and attempted to optimize the overall system for autonomous vehicles [2].
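
To make the fusion concept concrete, the sketch below shows its simplest form, inverse-variance weighting of two independent range estimates, in Python with hypothetical numbers; the systems surveyed in Ref. [2] use far more elaborate pipelines.

```python
def fuse_measurements(z_a, var_a, z_b, var_b):
    """Inverse-variance weighted fusion of two range estimates.

    The less noisy sensor gets the larger weight; the fused variance
    is smaller than either input, which is the point of fusing.
    """
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    z_fused = (w_a * z_a + w_b * z_b) / (w_a + w_b)
    var_fused = 1.0 / (w_a + w_b)
    return z_fused, var_fused

# e.g., RADAR reads 42.0 m (var 0.25), LIDAR reads 41.6 m (var 0.04)
print(fuse_measurements(42.0, 0.25, 41.6, 0.04))  # ~41.66 m, var ~0.034
```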

Nevertheless, the specific technologies used by individual companies are not provided there. This survey analyzes the projects available, sorts the types of sensors used, and compares the scenarios in which these sensors perform best. The result is an organized view of the patterns and purposes behind sensor choices and how sensors are used depending on the circumstances.

2.3 Sensor Deficiency Based on Weather Conditions.

Safety is one of the most important features in autonomous vehicles because human oversight of the vehicle is reduced or absent. The technology employed needs to adapt to any situation in order to guarantee the safety of the driver, pedestrians, and the environment and to avoid any kind of mishap. Studies have examined the problems sensors encounter in difficult weather conditions such as severe rain, snowstorms, and fog [3]. Sensors are likely to experience difficulties with environmental factors, so it is important to identify all the possible threats that could interfere with these devices gathering the required data.

Still, there is no in-depth comparison spanning the features previously mentioned, sensor limitations, and the differences among the technologies developed by the most advanced technology companies. It would be beneficial to have a strict comparison of the types of sensors used in each scenario, organized into categories that highlight not only their limitations but also their advantages and the reason for their use in that particular setting.

3 Study

3.1 Types of Sensors.

For autonomous vehicles to function properly, they need sensors, the main ones falling into three categories: RADAR, LIDAR, and cameras. It is important to explain how these sensors work and why developers use them to perform these difficult tasks. In addition, GPS/GNSS technology is used as a position sensor that is relevant for these types of vehicles. Other relevant sensors, such as inertial measurement units (IMUs) and odometry sensors, are also reviewed.

  1. RADAR: Radio Detection and Ranging (RADAR) sensors emit radio waves to measure the distance and speed of objects relative to the vehicle (a worked ranging example follows this list). They are used for obstacle detection in autonomous vehicles; their main function is to detect objects in the environment around the car, and they are designed to identify any type of vehicle, such as cars, motorcycles, and trucks. These sensors provide useful environmental perception by gathering data about the surroundings and passing it to the driver and the vehicle's other internal systems. The data collected help provide parking assistance, blindspot detection, and rear collision warnings. RADAR sensors have the advantage of remaining effective in fog, rain, or snow, which addresses a limitation of visual sensors. As mentioned in Ref. [4], RADAR sensors are getting cheaper, making them easier to fit in ordinary cars. However, they are not enough on their own for a high-end, fully autonomous car.

  2. LIDAR: Light Detection and Ranging (LIDAR) sensors are similar to RADAR sensors but use laser light rather than radio waves to map the environment. According to the study in Ref. [5], LIDAR sensors can detect objects 200–300 m away and emit short pulses to measure more than a million points per second. Their advantage over RADAR sensors is that they can create three-dimensional (3D) maps of the environment and visualize the objects in it. They help cover a 360-deg view and are used to complement RADAR sensors, providing more sophisticated data through 3D mapping. For autonomous vehicles, it is crucial to have a clear picture of what is happening around the vehicle, in effect making it capable of “seeing.” Although LIDAR was an expensive technology years ago, costs are now dropping low enough for use in the mass vehicle market.

  3. Cameras: Autonomous vehicles also need visual data in order to respond to events such as reading traffic lights and interpreting their colors to stop or move. Usually, autonomous vehicles carry cameras at different angles to give a 360-deg view around the vehicle; these cameras can readily recognize cars and pedestrians. However, cameras have notable downsides because the data they capture depend on a clear line of sight. Bad weather such as rain, fog, and snow can obstruct the view, and since cameras depend on illumination, darkness can degrade performance, which could lead to an accident if a pedestrian or a car is not identified. Some researchers have proposed solutions to improve visibility in these devices and minimize the side effects that affect safety [6].

  4. Ultrasonic Sensors: Lastly, ultrasonic sensors are becoming more popular in the autonomous driving community as a cheap alternative to LIDAR sensors for building obstacle detection systems into a self-driving vehicle. These sensors emit ultrasonic pulses that are reflected by an obstacle; the returned signal is processed by the vehicle's internal system, which extracts the obstacle data and responds accordingly. Their range typically reaches about 5.5 m, so they are designed for short-range obstacle detection and assistance tasks such as parking assistance and detecting objects near the car. Other limitations concern the short field of view and the difficulty of detecting fast-moving objects. Additionally, they are vulnerable to jamming and spoofing attacks that can leave the sensor physically unable to function or create false positives, which could lead to an incident without user supervision [7].

  5. Position Sensors (GPS/GNSS): The Global Positioning System (GPS) is frequently used by autonomous vehicles for navigation and positioning data. These systems use signals received from several GPS satellites. With current navigation technology, vehicles incorporate these systems to create routes, analyze traffic, and know the position of the vehicle in real time. This is then combined with the three major sensing modalities to create a navigation experience that requires no human interaction [8]. GPS is one constellation within the broader Global Navigation Satellite System (GNSS); combining constellations gives the vehicle more data and a more reliable position fix. This technology is mostly used by ground vehicles that are constantly on roads surrounded by other cars and pedestrians.

  6. IMU Sensors: An inertial measurement unit (IMU) measures angular rates, acceleration, and sometimes magnetic fields. It often acts as a backup sensor that must keep functioning if an accident or error causes the other sensors to stop working. It also plays an important role in sensor fusion because it complements the other sensors with acceleration data; this is crucial for protecting the driver at all times and for designing the vehicle to respond to any harmful situation that could put the driver's life in danger.

  7. Odometry Sensors: Odometry sensors use data from motion sensors, such as wheel encoders, to monitor the change of the vehicle's position over time. This type of sensing is linked to position sensors but focuses on relative position change. It is commonly used in robotics to estimate displacement from a starting point to an endpoint, and it is now incorporated in many autonomous vehicles to track the vehicle's position relative to a known location (see the odometry sketch after this list).
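
As a concrete illustration of the ranging principle shared by the RADAR, LIDAR, and ultrasonic sensors described above (items 1, 2, and 4), the following minimal Python sketch converts a round-trip echo time into a range and a single LIDAR return into a 3D point. The function names and numbers are ours, chosen for illustration, not taken from any particular sensor's interface.

```python
import math

C_LIGHT = 3.0e8   # propagation speed for RADAR/LIDAR pulses (m/s)
C_SOUND = 343.0   # propagation speed for ultrasonic pulses in air (m/s)

def echo_range(round_trip_s, wave_speed):
    """Range from a time-of-flight echo: the pulse travels out and back."""
    return wave_speed * round_trip_s / 2.0

def lidar_point(range_m, azimuth_rad, elevation_rad):
    """Convert one LIDAR return from polar form to a Cartesian 3D point."""
    horizontal = range_m * math.cos(elevation_rad)
    return (horizontal * math.cos(azimuth_rad),   # x: forward
            horizontal * math.sin(azimuth_rad),   # y: left
            range_m * math.sin(elevation_rad))    # z: up

# A 1-microsecond RADAR echo is ~150 m; a 20-ms ultrasonic echo is ~3.4 m
print(echo_range(1e-6, C_LIGHT), echo_range(0.02, C_SOUND))
# One LIDAR return at 10 m, 45 deg azimuth, 10 deg elevation
print(lidar_point(10.0, math.radians(45.0), math.radians(10.0)))
```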
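
Similarly, wheel odometry (item 7) reduces to integrating encoder distances into a pose estimate. Below is a minimal differential-drive version, assuming encoder ticks have already been converted to distances traveled; it is a textbook sketch, not a production implementation.

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckon a new 2D pose from left/right wheel travel distances.

    d_left/d_right come from wheel encoders (ticks already converted to
    meters); wheel_base is the distance between the two wheels.
    """
    d_center = (d_left + d_right) / 2.0         # forward travel of the midpoint
    d_theta = (d_right - d_left) / wheel_base   # heading change (rad)
    # Integrate along the average heading over the step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, (theta + d_theta) % (2.0 * math.pi)

# Starting at the origin facing +x, a small step that curves slightly left
print(update_pose(0.0, 0.0, 0.0, d_left=0.98, d_right=1.02, wheel_base=1.5))
```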

3.2 Levels of Autonomy.

In order to classify a vehicle as autonomous, the Society of Automotive Engineers (SAE) defined six levels (0 through 5) that classify any intended autonomous vehicle or device by its technology, distinguishing minor automation, such as parking assistance, from major autonomous capability, such as driving without human interaction. For the purposes of this research, it is important to understand these levels so that the technologies analyzed later can be properly categorized. The six levels split into two main groups: driver-support features and automated driving features. From level 0 to level 2, the driver is involved in all types of support and must remain aware at all times; automated features such as parking assistance, adaptive cruise control, and blindspot warning support the user and make the ride more comfortable, but the act of driving is controlled by the user. From level 3 to level 5, the vehicle takes over partial or full control and the user does not need to drive. Levels 3 and 4 are currently under development, with features such as traffic jam chauffeur systems or driverless taxi services in some smart cities where the car is fully autonomous within a limited domain. Level 5 is still far from current technology because it requires the vehicle to perform effectively by itself under any condition, which is hard to achieve at the moment.
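
For quick reference, the SAE taxonomy can be restated as a simple lookup table; the one-line descriptions below are our paraphrase of SAE J3016, not its official wording.

```python
# SAE J3016 driving-automation levels, paraphrased
SAE_LEVELS = {
    0: "No automation: warnings at most; the driver does everything",
    1: "Driver assistance: steering OR speed support (e.g., adaptive cruise control)",
    2: "Partial automation: steering AND speed support; driver must supervise",
    3: "Conditional automation: system drives in defined conditions; driver takes over on request",
    4: "High automation: no takeover needed inside the operational design domain (e.g., driverless taxis)",
    5: "Full automation: drives everywhere, in all conditions",
}

def driver_must_supervise(level: int) -> bool:
    """Levels 0-2 are driver-support features; 3-5 are automated driving."""
    return level <= 2
```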

3.3 Personal Vehicles.

To initiate the study, we first evaluate the category of personal vehicles. Many technology companies are innovating to make personal autonomous vehicles better, and new methods and improvements are released frequently. Most automation features in personal vehicles are comfort-related improvements, including adaptive cruise control, parking assistance, blindspot detection, and lane centering. But some companies are taking it one step further by developing autopilot systems and aiming for a fully autonomous car. The sensors previously described are becoming more accessible to mass-market vehicles, giving auto firms a wide variety of options for building cars with sensors that perform effectively in multiple scenarios.

  1. Tesla Model X and Model S: Tesla's autonomous vehicles are among the most famous on the current market. The company introduced its “Autopilot” mode in the Model X and Model S, giving the vehicle the capability to auto-steer, auto-park, change lanes automatically, and drive itself on highways for significant periods without human assistance. Tesla's cars are equipped with three types of sensors: cameras with a 360-deg range of vision, RADAR sensors providing long-range object detection, and 12 ultrasonic sensors used for car detection and the collision prevention system. To the surprise of many experts in the industry, Tesla does not use LIDAR sensors. Instead, it relies on ultrasonic sensors, which perform a similar distance-measuring function but use ultrasonic waves. The primary reason Tesla avoids LIDAR is cost: these sensors can cost $75,000 per unit, which conflicts with Tesla's goal of developing mass-market autonomous vehicles. The company's CEO, Elon Musk, has claimed results nearly as good as LIDAR using cameras alone. As presented in Ref. [9], it is possible to produce comparable images and 3D models with an image-based approach, which is significantly less expensive than LIDAR; such approaches are being adopted to reduce production costs and have led developers to new ways of creating LIDAR-like 3D models. However, using ultrasonic sensors instead of the common LIDAR configuration can be harmful to the operation of autonomous vehicles. A recent study [10] presented limitations of ultrasonic sensors that can cause an autonomous vehicle to misbehave: the researchers performed random and adaptive spoofing attacks that could force the vehicle to move at an undesired moment, and jamming attacks that prevented proper detection and created collision risks (a plausibility-check sketch follows this list). Therefore, despite the cost advantages ultrasonic sensors provide, their vulnerabilities can be harmful to society and a threat to pedestrians and customers through the possibility of collisions due to sensor malfunction.

  2. Google Waymo: Google is also involved in the autonomous vehicle industry. Its self-driving car project began in 2009 and was later spun off as the company Waymo, dedicated exclusively to producing an autonomous car that guarantees safe mobility for its customers through automation technology. At first, the company used LIDAR sensors from Velodyne, a technology company that supplies LIDAR to many self-driving car programs. Later, Waymo began designing and producing its own LIDAR sensors because the cost was too high and was a handicap to introducing Waymo technology to a mass market. Its sensor, the Laser Bear Honeycomb LIDAR, has a distinctive design and provides a 95-deg field of view. Compared to sensors with a 30-deg field of view, this gives more coverage and lets the vehicle detect objects without mounting as many sensors (see the coverage sketch after this list). It can also detect objects at very close range compared to other LIDARs and sensors with significantly greater minimum ranges: the Honeycomb has a minimum range of zero, enabling the vehicle to avoid objects that are very close to the car. The vehicle also implements RADAR systems for general obstacle detection, such as pedestrians and vehicles, including their speed; cameras for visual information; and microphones to hear audio cues such as ambulance and police sirens. In addition, one of Waymo's biggest advantages over competing technologies is its data set, specifically for testing autonomous vehicle performance. Waymo owns one of the biggest free data sets in the field, providing information from the five different LIDAR sensors used in its vehicles, which are tested every day around the world. The benefit of having the data set available to any researcher is that experts and enthusiasts can create their own tests to probe the limits of Google's sensors and cameras and help improve the technology; some groups are expanding the data set further and covering different sensor types from cameras to RADARs and LIDARs [11,12]. Google and Waymo are currently developing some of the most efficient and advanced sensor technology for autonomous vehicles, and Waymo One is one of the most famous auto-taxi services being tested in the United States. Even though the company reports significant amounts of data and testing for hazardous conditions such as rain, fog, and extreme climates, some locations cannot be tested due to a lack of supporting technology in the area. Smart cities are the main target for these vehicles; in other conditions, the precision of the sensors will certainly not be the same, which can degrade the safety and performance of the vehicle.

  3. Baidu Apollo: Baidu, one of the largest internet and technology companies in the world, owns a self-driving car project named Apollo. It is currently deploying prototype autonomous cars planned for use in several Chinese cities as robo-taxis. The Apollo vehicle is built with the Puck series, among the newest and most efficient short- and long-range LIDAR sensors on the market, supplied by the Velodyne LIDAR company. These LIDAR sensors are among the most advanced in the industry and are used in many self-driving programs for 3D mapping and for retrieving environmental data to build collision avoidance and object detection systems. Apollo also employs other LIDAR sensors from the same company that are more expensive but perform very well for object detection. A recent study [13] compared two such sensors and how their resolution influences object detection while the vehicle is driving: the researchers built an imaging system for multi-object tracking using both a high-resolution Velodyne LIDAR and a low-resolution sensor. The cheaper sensor achieved a precision of 69.3% across two scenarios, vehicle and no vehicle. Within a significant range there is a detection gap relative to the high-resolution Velodyne HDL-64 sensor, but the cost difference is large enough to justify considering these cheaper sensors more frequently and integrating them into different systems. Tests were also done with Velodyne Puck sensors in different weather conditions [14]: rain has very limited influence on the Velodyne LIDAR's performance, but fog matters and must be considered. More power is needed in fog because it significantly affects vision; with a 905 nm LIDAR, pedestrians, reflectors, and traffic signs are either not visible at all or reach only 50–70% visibility. The company also uses two types of RADAR sensors in its Apollo vehicles: the ARS408-21 from Continental, used for adaptive cruise control, forward collision warning, and emergency brake assist, and the B01HC from Racobit for mid- to long-range vehicle detection in various work environments. However, according to the previously mentioned study, RADAR sensors are not efficient at detecting non-metal objects in hazardous weather conditions, so LIDAR sensors are needed to provide additional data, and the two are combined to guarantee high vehicle performance. For cameras, Apollo currently uses the MARS imaging system, supported by two other cameras designed by Wissen Technologies and Leopard Imaging Inc.

  4. BMW: The BMW car manufacturer has also invested in autonomous driving technology. Currently, the company provides limited self-driving features such as park assistance, adaptive cruise control, and blindspot detection in many of its luxury car series. The automobile giant plans to deploy a level 3–5 autonomous project in 2021 called the iNext series. In terms of hardware, it will implement solid-state LIDAR from the startup Innoviz, which is cheaper and faster and provides higher resolution when imaging the environment around a vehicle. Solid-state units have no spinning parts, unlike mechanical LIDARs; placed at the front, back, and sides, their fused data create a field of view that competes with mechanical sensors' performance at a low cost [15]. The startup states that the sensor is resilient to sunlight and adverse weather conditions and capable of delivering a clear 3D point cloud up to 250 m. Because the project is only intended to launch in 2021, research is limited by the lack of publicly accessible information. However, the BMW group previously tested and developed Level 2 and 3 autonomous features that are now, in some form, implemented in its latest cars. Highly automated driving (HAD) systems are being perfected after the introduction of minor features such as traffic jam assistance and lane support. The test vehicle, a BMW 5 Series, carries environmental perception sensors, cameras, and software features that link the autonomous actions. The HAD system uses 12 sensors: four laser scanners, three RADARs, four ultrasonic sensors, and a mono camera. Fusing these sensors gives the ability to perceive objects at close and long range in every direction and to behave accordingly in each situation. Creating 3D maps by scanning the environment is an important part of the process, and with sensor fusion and the HAD system, 3D mapping is becoming feasible and enhances the performance of autonomous features. However, this aspect can still be improved: one of the biggest disadvantages is the lack of testing for this specific vehicle, and there is a significant gap between the research findings and the validation of a customer-ready product [16].

  5. Other Technologies: Some other companies are also deeply involved in creating their own versions of autonomous vehicles. Ford Motor is entering the industry in partnership with Argo AI, a company responsible for developing the hardware and software for autonomous vehicles; their project is based on the Ford Escape Hybrid, adapted with Argo AI's technology to develop an autonomous car that could be deployed to a mass market. The vehicle manufacturer General Motors also has a self-driving vehicle project in partnership with the technology company Cruise: they started with a car provided by General Motors and fitted it with multiple sensors, including cameras, RADARs, and LIDARs, to add autonomous features. Artificial intelligence, 3D and 2D mapping, and obstacle detection systems make the car as autonomous and secure as possible. However, the hardware and vehicle specifications of these companies are not publicly available to researchers, which is an important limitation for this survey: without that information, a clear analysis of the sensor technology each company uses is not possible.
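
Concerning the ultrasonic attacks reported in Ref. [10] (item 1), a common class of mitigation is to gate each ultrasonic reading through a plausibility check before the driving logic uses it. The sketch below is our hypothetical illustration of such a gate, cross-validating against an overlapping RADAR measurement; it is not Tesla's actual logic, and the names and thresholds are invented.

```python
def plausible_ultrasonic(us_range_m, radar_range_m, max_disagreement_m=0.5,
                         sensor_min_m=0.2, sensor_max_m=5.5):
    """Gate an ultrasonic reading before the parking/collision logic uses it.

    Rejects readings outside the sensor's physical envelope (a jamming
    symptom is a stuck or missing echo) and readings that disagree too
    much with an overlapping RADAR measurement (a spoofing symptom).
    """
    if not (sensor_min_m <= us_range_m <= sensor_max_m):
        return False
    return abs(us_range_m - radar_range_m) <= max_disagreement_m

print(plausible_ultrasonic(1.2, 1.3))  # True: the two modalities agree
print(plausible_ultrasonic(0.0, 1.3))  # False: below the physical minimum
```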
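
The coverage benefit of Waymo's wide-FOV Honeycomb LIDAR (item 2) is simple arithmetic: ignoring mounting overlap, the minimum number of identical sensors needed to close a 360-deg ring is the ceiling of 360 divided by the per-sensor FOV.

```python
import math

def sensors_for_full_ring(fov_deg: float) -> int:
    """Minimum count of identical sensors to cover a full 360-deg ring."""
    return math.ceil(360.0 / fov_deg)

print(sensors_for_full_ring(95.0))  # 4 units at 95-deg FOV
print(sensors_for_full_ring(30.0))  # 12 units at 30-deg FOV
```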

3.4 Public Transportation.

For public transportation, implementing autonomous vehicles could be revolutionary for the development of this area in different countries. Building good public transportation systems is often difficult, but artificial intelligence can step in to make them smoother and safer. Some researchers have begun analyzing potential projects in different cities and how different territories could adapt to this type of technology [17], while reviewing the economic implications of these projects and how governments might consider such alternatives. In fact, some technology companies have started deploying self-driving projects, including shuttles and autonomous buses, in cities to test these vehicles for public transportation [18].

  1. Autonomous Buses (Volvo): The most consequential use of autonomous vehicles in public transportation may be the bus. Automating a bus is challenging because of its size, but companies like Volvo have started projects to provide a fully autonomous bus. The company is still conducting tests to guarantee a working vehicle before developing it for the mass market. Methods and systems have been maturing on some of Volvo's electric buses, which were provided to researchers to develop the systems that could turn the bus into an autonomous vehicle. In Ref. [19], a variety of sensors is fused to give the bus the ability to drive itself. The research uses optical encoders installed in the wheels of the vehicle to collect velocity data, which is then used to stop the vehicle and regulate its speed properly (see the encoder-speed sketch after this list). Proximity sensors detect nearby objects and vehicles, providing data useful for a brake assistance system, and LIDAR sensors are used for obstacle detection. The SICK LMS221 is a mid-range LIDAR with a considerably lower price than the Velodyne LIDARs already mentioned in this study. There are also more recent experiments and trials, supported by Ford Motors, toward a sustainable autonomous bus that could circulate in smart cities in Europe or Asia. Proposals in Ref. [20] use a Volvo electric bus fitted with autonomous features based on different types of sensors, including ultrasonic sensors, LIDARs, RADARs, cameras, and optical sensors. The authors provide methods and systems that focus on single sensor types as well as a sensor fusion approach. Each sensor excels in its specific domain, and all are useful in specific situations that must be considered in autonomous vehicle development, since safety and high performance are required to guarantee the satisfaction and overall safety of the clients. However, the experiment does not consider abnormal weather conditions or situations other than clear weather; it is crucial for this type of automotive technology to address such situations because drivers and passengers encounter them frequently.

  2. Autonomous Shuttles (Navya): Unlike Volvo, some startups and relatively smaller companies have entered the business of redesigning public transportation by adding autonomous features to buses and shuttles. Navya is a technology company specialized in designing and building autonomous vehicles. It created an autonomous shuttle for public transportation in smart cities around the world: the Autonom Shuttle Evo, designed to carry up to 15 passengers on public and private roads and intended to operate autonomously at all times to facilitate passenger transportation in different situations and places. The shuttles employ cutting-edge technology with a data fusion of cameras, LIDAR sensors, GPS, and odometry to guarantee precision and safety for the users [21]. The vehicle carries 10 LIDAR sensors for 2D and 3D mapping to guarantee proper obstacle and vehicle detection and to map the environment. The sensors employed are the Velodyne VLP 360 deg (multi-layer), Valeo SCALA (multi-layer), SICK MRS (multi-layer), and SICK TiM (single layer). Combining multiple LIDAR sensors from different brands is highly effective because of the cost variation and because each sensor offers different specifications. The Velodyne VLP 360 deg is ideally installed at the top of the vehicle, front and back, to cover a full 360 deg and collect all the data needed to build 3D maps; the other three sensor types are placed in specific areas to serve functions such as parking assistance and detection of objects at short and long range. Cameras are placed at the front and back of the vehicle for a clear view of the environment; they are also used to monitor the shuttle remotely for safety and to verify that it is working properly. Wheel encoders check the speed of the vehicle so it can be regulated depending on the road, weather, and situation. Finally, a GNSS antenna at the top of the vehicle communicates with a GPS sensor to track the shuttle's exact location at every moment. The advantage of these shuttles is the technology and innovation they demonstrate for public transportation, which can then be scaled to bigger projects. The sensor fusion is effective, and the shuttle can work properly for several hours with enhanced safety through multiple methods of obstacle detection. However, the operating area is very limited, intended for highly controlled environments such as university campuses and selected zones of smart cities. Since the vehicle is small, there is no room for more than 15 people, which is a problem if a company or government wants to use this technology for large public transportation projects [22]. The vehicle is currently being tested in several cities as a transportation medium on campuses within a specific area.

  3. Autonomous Shuttles (EasyMile): EasyMile is a technology company that works on autonomous vehicle technology and solutions. It currently operates an autonomous shuttle project in several cities across Europe, Asia, and the United States. The design of the vehicle is very similar to the Navya shuttle presented in the previous section: it has a maximum capacity of 15 passengers and can function in adverse weather conditions such as rain, snow, and fog. The maximum speed is 45 km/h, electronically limited to 25 km/h for safety. In terms of sensors, there are limitations for this company that hinder this research: specific hardware information is not publicly available, although details about the partnership between EasyMile and Velodyne are. Velodyne supplies its latest LIDAR sensors to the EZ10 autonomous shuttle project. Compared to the Navya shuttle, the EZ10 adds RADAR sensors to its system to enhance safety, fusing their output with the data from the LIDAR sensors; this can increase effectiveness by providing obstacle detection methods that are not limited to one type of sensor. The shuttle's testing program is useful for researchers analyzing the vehicle's performance in different situations, with standard scenarios in places such as university campuses, industrial areas, and private and public routes with a defined course. The company offers several modes of operation for the EZ10: a metro mode with fixed routes and stop stations for dropping off and picking up passengers; a bus mode with several routes and the option of stopping on passenger request; and an on-demand service that works like a taxi or an Uber, where the vehicle receives a ride request to a location specified by the passenger [23]. The negative aspects of the vehicle are similar to Navya's: the shuttle is functional, but its capabilities are limited. Further research is needed for future technology to adapt to bigger projects and develop a truly autonomous public transportation system viable for different cities and urban areas.

  4. Autonomous Trains: As autonomous vehicles become more prominent, more companies are interested in investing in them, including train and rail companies. Public transportation needs innovation to increase effectiveness and move the millions of passengers demanding these services. The current state of autonomous trains is described on a scale of automation levels very similar to the one mentioned earlier in the paper: the grade of automation for trains goes from level 1 to level 4, where a train system is considered level 4 when it can operate autonomously, accelerating, stopping, closing and opening doors, and detecting obstacles along the track. At the moment, the autonomous train industry is uncertain; most of the technology and systems for these trains are owned by governments in multiple countries on every continent. Some trains already operate at automation level 4, principally in airports, metro systems, and some larger-scale lines. The technology and sensors employed in those projects are not the basis of this paper: most of these autonomous trains run in closed areas or are configured and built differently from other types of autonomous vehicles. Nevertheless, some state-owned companies have presented future solutions for outdoor autonomous trains and rail systems that employ the sensors analyzed in this paper, such as LIDAR, RADAR, and cameras, along with other sensor types that will also be interesting to research. France is notably advanced in autonomous trains and rail systems, since trains are significantly popular in Europe. SNCF (French National Railway Company) is a French state-owned company that develops many of the railways and trains across the country. The company has presented a fully autonomous outdoor train, expected for 2023, intended to carry passengers and freight; the prototype includes LIDARs, RADARs, infrared cameras, ultrasonic sensors, and GPS localization. According to the company and related research, autonomous trains will add many benefits for public transportation and its users: lower energy cost, less traffic and delay, and increased safety with faster response times thanks to automation and sensor fusion [24]. However, the technical specifications of the sensors employed are not publicly provided by the developers because of governmental restrictions; this is the main limitation of this research with regard to autonomous trains and related vehicles. Moreover, some research papers and projects provide insight into prototypes and technology that could be used to develop autonomous trains, focusing on obstacle detection for open rails and tracks. German researchers have developed a system called autoBAHN to add autonomous features to trains on existing railroads. Within the system's architecture, different sensors are implemented for obstacle detection, localization, and fast response: the plug-in devices are Velodyne LIDARs, RADAR sensors, infrared sensors, 2D cameras, and ultrasonic sensors.
The large number of sensors is worth noting: a vehicle as big as a train needs far more information to guarantee safety and efficiency, given its dimensions, than smaller vehicles do [25].
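
Item 1 above notes that optical wheel encoders supply the velocity data used to regulate and stop the Volvo bus. A minimal sketch of that computation, with hypothetical encoder parameters, turning ticks in a sampling window into a speed estimate:

```python
import math

def wheel_speed_mps(ticks, ticks_per_rev, wheel_diameter_m, dt_s):
    """Vehicle speed from an optical wheel encoder over a sampling window."""
    revolutions = ticks / ticks_per_rev
    distance_m = revolutions * math.pi * wheel_diameter_m
    return distance_m / dt_s

# 180 ticks in 0.1 s on a 1024-tick/rev encoder and a 1.0-m wheel:
print(wheel_speed_mps(180, 1024, 1.0, 0.1))  # ~5.5 m/s (~20 km/h)
```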

3.5 Smart Farming.

The farming industry has changed considerably in recent years, adopting technological improvements to increase efficiency. This has made the industry faster at production and yielded more sustainable, reliable, and higher-quality products. The technology race has not stopped, and autonomous vehicles are now being deployed in the fields to enhance production and expand what can be produced effectively. Researchers affirm that what is now called “Smart Farming,” which involves autonomous vehicle technology, networks, institutions, and diversity in farm products, can become the key to sustainable agriculture [26]. For this study, three categories of autonomous vehicles used in the farming industry are analyzed. Unmanned ground vehicles (UGVs), or autonomous tractors, are analyzed first, exploring the technologies on the market and how they enhance farming. Tractors are essential farm equipment used for many tasks, including transporting employees, harvesting crops, planting, fertilizing, and cultivating. However, a workforce is needed to operate these tractors in person, which can lead to higher expenses and the possibility of miscalculation or inefficiency due to human error. Autonomous tractors have clear potential for these tasks; the purpose is not to reduce staff but to increase the farm's efficiency, maximize profit, and enhance the production process by putting technology in the field.

Second, unmanned aerial vehicles (UAVs) are also commonly used for farming alongside UGVs, combining both technologies to increase productivity and quality. They are typically capable of analyzing the environment according to the type of production under way. They help optimize the production process, monitor crop growth by checking crop quality, and provide a big-picture view by communicating with the other autonomous vehicles or devices in use. Some companies that developed drones for other purposes, such as personal or military use, have begun applying drone technology to the farming industry; this is one way the industry can develop at a fast pace and revolutionize farming methods going forward [27].

Finally, the rise of modern robots for farming was considered important for this research. Robots are being developed to increase crop productivity. They differ from larger autonomous farming vehicles in that robots are better suited to repetitive tasks not intended for a tractor or other large, human-operated vehicle. Robots are ideal for small-field production, can perform more specific analysis, and can work uninterruptedly when other types of autonomous vehicles need supervision. Specific tasks can be handled by these robots across a large variety of production types: some robots specialize in applying fungicides or other chemicals, driving through the field to combat diseases that could spread among the plants. Robots can also help collect weeds, cut crops, or gather specific plants or vegetables. These devices are especially useful for automating the production process and for reducing the workload with plants that are harder to extract from the ground than others. For example, mushroom production is often difficult because of where mushrooms grow, which is why mushroom farms are being heavily redeveloped with high technology to boost and simplify production [28].

  • (1) John Deere Tractors With AutoTrac Technology: John Deere is a pioneer in the farming industry, providing technology that includes tractors, UAVs, sprayers, and many tools for agricultural processes. It has developed autonomous features that are now implemented in many of its tractors and agricultural hardware. The company's automated vehicle guidance system drives a tractor along a defined path, increasing precision and efficiency across several agricultural activities. This technology, called AutoTrac, is now implemented in many John Deere products. To scan the territory and provide the details needed to set up the predefined path, AutoTrac uses a receiver-type sensor; the model used by John Deere is the StarFire 6000, which works with a satellite signal to guide the tractor and maintain a straight path regardless of weather conditions (a cross-track guidance sketch follows this list). John Deere emphasizes precision in its autonomous farming features while keeping them cost-effective. The benefits of this technology in farming include covering more acres in less time, lower fuel consumption, faster operation, and a much less physically demanding experience for the employee [29]. The guidance system is accompanied by automatic steering that keeps the tractor in the direction indicated by AutoTrac. The company has also developed a fully autonomous tractor prototype called the Joker; however, this project is not expected to reach the mass market for around ten years, because three technology aspects still need to be perfected: electrification, artificial intelligence, and automation. To support these benefits, studies have evaluated the economics of the technology to determine how much cheaper it would be for companies and farmers to adopt this autonomous feature in their tractors. According to researchers in Ref. [30], the guidance system brings significant savings, reducing labor cost by 14%, fuel consumption by 14%, seed costs by 6%, and fertilizer cost by 6%. Lower costs could also mean higher profit and more motivation for employees and farmers to support these technologies and greatly increase productivity.

  • (2) Agricultural Drones (DJI): DJI is a Chinese technology company specialized in building drones for different types of tasks. It recently introduced a drone specialized for agricultural activities, specifically spraying and daily maintenance. The Agras T16 is a compact, portable drone able to carry 16 L in its spray tank for the maintenance of any type of production. In terms of sensors, the drone carries several types to ensure safety and the proper functioning of localization, navigation, and crop identification. It is equipped with dual IMUs and barometers to track the environment and keep the flight safe. For navigation, the Agras T16 uses GNSS localization to track the drone at all times, and first-person-view (FPV) cameras with spotlights support a remote-control function for monitoring the drone's performance. Additionally, a powerful RADAR system enhances navigation in weather such as rain or fog, since the radar frequency is not affected by these variables. The sensors can also create a 3D digital map to give an overall view of the environment and adapt to specific situations. Since the Agras T16 is a hexacopter, it has increased mobility and stability, letting it scan the field easily and take full advantage of autonomous vehicle sensor technology [31,32].

  • (3) Agricultural Robots (NAIO Technologies): NAIO Technologies produces a variety of agricultural robots for farming activities such as weeding, hoeing, and harvesting to help farmers work their crops. The OZ weeding robot is specialized in weeding, with transport and harvest assistance. It has several tools and modes for working the land on specific tasks, for example, breaking up compacted soil, removing grass weeds at the roots, removing weeds between rows, and removing seedlings growing between rows. To make these tools work, the robots are equipped with several sensors that enable recognition of the land and of the weeds, roots, or seeds on the ground. The precision hardware consists of cameras and laser sensors (LIDAR) with a precision of 2 cm. When the focus is agriculture, the main obstacle to 100% sensor accuracy in identifying the field and the crops is natural irregularity: the land is not homogeneous. The LIDAR sensors used by NAIO help reduce this limitation by gathering data and building a 3D map of the environment to identify the types of plants. Data from other sensors are also combined to monitor oil filters, oil levels, and temperature. The vision system scans the territory and produces a grayscale image to differentiate soil from plants, giving a clear view to both the user and the robot (see the segmentation sketch below); the average deviation between the real and the observed crop ranges from 6 to 223 mm [33]. Demonstrations of the OZ weeding robot are being run with these sensors to determine the best algorithms for crop recognition and to continually improve the robot's detection system [34]. The advantages of these robots relate to their efficiency and precision in the farming industry: resources are allocated better, and in the long term there are significant reductions in overall production costs. There can be disadvantages, however, such as occasional failures or inconsistency on specific tasks. In addition, a lot of work is required to reach an accuracy above 90%, and the algorithms and sensors can fail when analyzing and identifying soil and crops. Finally, for very large projects a substantial number of robots would be needed, which could result in high cost; more advanced and efficient technologies should be considered for extended farming projects.
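
A guidance system such as AutoTrac (item 1) steers so as to minimize the cross-track error between the receiver's reported position and the predefined path. The sketch below computes that error for a straight path segment in 2D; it is a generic illustration, not John Deere's implementation.

```python
import math

def cross_track_error(pos, seg_start, seg_end):
    """Signed lateral offset of `pos` from the line through a path segment.

    Positive means the vehicle sits left of the path direction; the
    guidance loop steers to drive this value to zero.
    """
    (px, py), (ax, ay), (bx, by) = pos, seg_start, seg_end
    dx, dy = bx - ax, by - ay
    # 2D cross product of the segment direction with the vector to the vehicle
    return (dx * (py - ay) - dy * (px - ax)) / math.hypot(dx, dy)

# Path runs east from (0, 0) to (100, 0); the tractor is 2 m north of it
print(cross_track_error((50.0, 2.0), (0.0, 0.0), (100.0, 0.0)))  # 2.0
```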
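
The grayscale soil/plant separation described for the OZ robot (item 3) is commonly implemented with a vegetation index such as excess green (ExG = 2G − R − B) followed by a threshold. The sketch below is a generic version of that idea, assuming a normalized RGB image; it is not NAIO's actual pipeline.

```python
import numpy as np

def plant_mask(rgb: np.ndarray, threshold: float = 0.05) -> np.ndarray:
    """Boolean plant/soil mask from an RGB image via the excess-green index.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    """
    total = rgb.sum(axis=2) + 1e-6             # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b                      # excess-green index
    return exg > threshold                     # True where vegetation

# A 1x2 image: one green (plant) pixel, one brownish (soil) pixel
img = np.array([[[0.1, 0.8, 0.1], [0.4, 0.3, 0.2]]])
print(plant_mask(img))  # [[ True False]]
```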

3.6 Logistics.

The logistics industry keeps growing with the development of new technologies. Companies such as Amazon, FedEx, and the United States Postal Service are finding new ways to deliver products safely and effectively from the supplier directly to the customer. Online sales and e-commerce are now integral to any company that sells products to its customers. Logistics companies could use advanced technology to become more efficient and to manage time and resources better, and autonomous vehicles could be introduced to automate some of the transportation and delivery processes [35]. Researchers have examined the potential positive and negative consequences of implementing autonomous vehicles in the logistics industry. There are not yet many big autonomous vehicle projects in this industry, but it is emerging quickly because newer and more advanced technology would help many companies work more effectively [36].

  1. Mercedes-Benz Autonomous Truck: Mercedes-Benz plans to deploy one of its biggest autonomous vehicle projects in 2025, developed together with the truck manufacturer Daimler. The companies plan to introduce a self-driving truck with a driver assistance system able to take control of the vehicle during long hours of highway driving; since truck drivers constantly face high traffic density, the company wants to ease those constraints. The Highway Pilot system has been developed to guarantee safety and to move autonomous trucks along highways while regulating their speed. The prototype is currently being tested on public roads in Germany and the United States, and the eventual goal is to connect several autonomous trucks through vehicle-to-vehicle communication over a satellite connection [37]. In terms of sensors, the vehicle is equipped with the latest RADAR sensors and cameras. Notably, the prototype does not use LIDAR sensors, unlike most other technology companies, which rely on LIDARs for 3D mapping and environment recognition. The truck carries lateral RADARs on both sides with a 170-deg field of view (FOV) and a range of 60 m, a single short-range RADAR with a 130-deg FOV and a range of 70 m, a single front stereo camera with a 45-deg FOV and a range of 100 m, and a full-range RADAR at the front of the vehicle with an 18-deg FOV and a maximum range of 250 m (see the wedge-test sketch after this list). All sensors gather data about the environment, such as pedestrians, traffic signals, and vehicles [38]. There are advantages to this sensor setup for autonomous trucks: since this model is a semi-autonomous truck intended to assist the driver during the trip, fewer sensors are needed. Because the truck mostly needs to monitor its close surroundings when driving on highways, RADAR sensors can retrieve all the data necessary at close distance, and RADARs remain accurate in poor weather [39]. However, the lack of LIDAR sensors can also affect the vehicle's performance: LIDARs could provide clearer obstacle detection with 3D mapping and a larger FOV than RADARs. RADARs also tend to classify objects poorly, and due to their low image resolution, it could be harder for the autonomous truck to scan the environment without a LIDAR [40].

  2. Freightliner Inspiration Truck: The Freightliner Inspiration Truck is another autonomous truck prototype expected to be deployed commercially in 2025, alongside the Mercedes-Benz autonomous truck. This company, however, was the first to gain approval to test a semi-autonomous truck on a national highway in the United States. The sensors on this truck are similar to those on the Mercedes-Benz truck described above: the front bumper carries a RADAR scanning system with short- and long-range units. The long-range RADAR has a range of 820 feet and an 18-deg FOV, and the short-range RADAR has a range of 230 feet and a 130-deg FOV. In addition, there is a stereo camera with a range of 328 feet and a 45-deg horizontal by 27-deg vertical FOV. These sensors feed the Highway Pilot system, which uses the data to execute lane stability, collision avoidance, speed control, braking, and automatic steering [41].
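
Given the FOVs and ranges quoted for the two trucks above, whether an object is visible to a particular sensor reduces to a wedge test: within the maximum range and within half the FOV of the sensor's boresight. A minimal sketch in the sensor's own frame (x forward, y left; the names and example points are ours):

```python
import math

def in_sensor_fov(x_m, y_m, fov_deg, max_range_m):
    """True if a point (in the sensor's frame) lies inside its detection wedge."""
    rng = math.hypot(x_m, y_m)
    bearing_deg = math.degrees(math.atan2(y_m, x_m))
    return rng <= max_range_m and abs(bearing_deg) <= fov_deg / 2.0

# Front full-range RADAR of the Mercedes-Benz truck: 18-deg FOV, 250-m range
print(in_sensor_fov(100.0, 5.0, 18.0, 250.0))  # True: ~2.9 deg off boresight
print(in_sensor_fov(50.0, 30.0, 18.0, 250.0))  # False: ~31 deg off boresight
```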

4 Results

This study carefully analyzes sensor configurations for different types of vehicles, with various functionalities, used across several industries. Table 1 provides an organized structure for comparing sensor configurations and lists their advantages and disadvantages; sensors are classified by vehicle type, industry, and level of automation. The table reflects the main purpose of this paper: to identify sensor configurations that perform better than others across changing scenarios. After evaluating the advantages and disadvantages of each sensor configuration, a set of recommendations is proposed for specific vehicle types and functions. Table 2 presents several sensor configurations that could perform significantly better than alternative solutions in different scenarios and tasks. Personal vehicles are the most demanding in terms of technology, and highly advanced autonomous cars should take advantage of LIDAR technology to gain precision and increase passenger safety. Buses, shuttles, and trucks can benefit from similar configurations, with cameras and sensors that have a high FOV to anticipate accidents, given the difficulty of stopping larger vehicles. Drones are small vehicles that do not require many sensors; cameras, RADARs, and IMUs are fundamental for monitoring weather changes and for object detection, and drones can be customized for different tasks. Finally, robots are not demanding in terms of sensors: RADARs and cameras work well for several types of tasks.
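As a minimal sketch of how the comparison in Table 1 can be queried programmatically (the record layout and helper are ours, purely illustrative; the sensor counts are taken from the table below), each row can be stored as a small record and filtered, for example, by automation level or by the number of sensing modalities:

```python
from dataclasses import dataclass

@dataclass
class Config:
    company: str
    industry: str
    cameras: int
    lidars: int
    radars: int
    ultrasonic: int
    sae_level: int  # SAE J3016 level as reported in Table 1

    def modalities(self) -> int:
        """Number of distinct sensing modalities actually in use."""
        return sum(n > 0 for n in (self.cameras, self.lidars, self.radars, self.ultrasonic))

# A few rows from Table 1 (counts as reported there).
rows = [
    Config("Tesla", "Personal vehicles", 8, 0, 1, 12, 2),
    Config("Google Waymo", "Personal vehicles", 12, 5, 6, 0, 4),
    Config("Navya", "Public transportation (shuttles)", 2, 10, 0, 0, 4),
    Config("DJI", "Smart farming (drones)", 2, 0, 4, 0, 4),
]

# Example query: which SAE Level 4 configurations rely on LIDAR?
for r in rows:
    if r.sae_level == 4 and r.lidars > 0:
        print(f"{r.company}: {r.lidars} LIDARs, {r.modalities()} sensing modalities")
```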

Table 1

Comparison of autonomous vehicle technologies

Each entry lists the industry, company, sensor counts (cameras / LIDARs / RADARs / ultrasonic), range of coverage, level of automation, advantages, and disadvantages.

Personal vehicles. Tesla [9,10]. Cameras: 8; LIDARs: 0; RADARs: 1; ultrasonic: 12. Coverage: cameras with a 360-deg FOV; RADAR with a 160-m range. Automation: Society of Automotive Engineers (SAE) Level 2. Advantages: lower costs by using ultrasonic sensors to detect objects; with fewer autonomous features, safety monitoring is easier for Tesla. Disadvantages: ultrasonic sensors can misbehave or fail to function properly and are vulnerable to jamming and spoofing attacks; the limited hardware constrains progress toward a higher level of automation.

Personal vehicles. Google Waymo [11,12]. Cameras: 12; LIDARs: 5; RADARs: 6; ultrasonic: 0. Coverage: LIDAR with a 360-deg FOV and a range of 300 m; cameras with a range of 500 m. Automation: SAE Level 4. Advantages: costs reduced by 90% by using in-house LIDARs; greater flexibility with the HoneyComb LIDAR, whose features include a 95-deg FOV and a minimum range of 0 m; an open data set for testing purposes that increases the probability of innovation. Disadvantages: limited testing locations in extreme weather conditions; smart cities as the only target market; Waymo One is limited to an auto-taxi service, which restricts advanced testing methods.

Personal vehicles. Baidu Apollo [13,14]. Cameras: 5; LIDARs: 2; RADARs: 3; ultrasonic: 5. Coverage: LIDAR with a 360-deg FOV covering 300 m. Automation: SAE Level 4. Advantages: the variety of sensors creates diversity and takes advantage of different technologies; using different LIDARs for specific scenarios helps reduce costs; good safety results according to the data. Disadvantages: switching to low-resolution LIDARs for price reasons can negatively affect performance; Puck LIDARs have low resolution and a visibility accuracy of 60%; good LIDAR precision is needed because RADARs cannot detect non-metallic objects in dangerous weather conditions.

Personal vehicles. BMW Series 5 [15,16]. Cameras: 2; LIDARs: 4; RADARs: 5; ultrasonic: 4. Coverage: N/A. Automation: SAE Level 3. Advantages: implementation of solid-state LIDARs, which are cheaper and more effective than mechanical LIDARs; precise 3D mapping and point clouds up to 250 m; good sensor fusion across 12 sensors. Disadvantages: testing limitations for the prototype and its adaptation to real life; a gap between the information available and the current state of the project; limited mapping beyond 250 m.

Personal vehicles. Argo AI, Cruise. Cameras: 9; LIDARs: 2; RADARs: 2; ultrasonic: 0. Coverage: LIDARs with a 360-deg FOV and a range of up to 400 m. Automation: N/A. Advantages and disadvantages: N/A; specific sensor information is not publicly available, which is the main limitation for the research and the reason no clear advantages or disadvantages can be listed.

Public transportation (buses). Volvo [19,20]. Cameras: 6; LIDARs: 3; RADARs: 2; ultrasonic: 2. Coverage: LIDAR with a 160-deg FOV and a range of 200 m. Automation: SAE Level 2. Advantages: different types of sensors are tested and implemented to gather as much information as possible; an autonomous vehicle of this size requires a larger number of sensors. Disadvantages: the prototype has not been tested in different weather conditions, which could influence the performance of the bus.

Public transportation (shuttles). Navya [21,22]. Cameras: 2; LIDARs: 10; RADARs: 0; ultrasonic: 0. Coverage: Velodyne LIDARs with 360- and 180-deg FOVs. Automation: SAE Level 4. Advantages: the technology employed can be scaled to develop bigger projects; sensor fusion is accurate and effective, with enhanced safety systems and obstacle detection methods. Disadvantages: vehicle space is very limited (up to 15 passengers); testing areas are limited and controlled (campuses, airports, smart cities); the technology has not been tested in bigger vehicles, which could be a limitation when building a project with larger passenger capacity.

Public transportation (shuttles). EasyMile [23]. Cameras: 3; LIDARs: 2; RADARs: 1; ultrasonic: 0. Coverage: LIDAR with a 360-deg FOV and a 250-m range; front and lateral cameras with a range of up to 300 m. Automation: SAE Level 4. Advantages: a great variety of operating modes (metro mode, bus mode, on-demand mode); the EZ10 project implements RADAR sensors for data fusion and safety. Disadvantages: similar to Navya; the shuttle is functional, but its capabilities are limited, and it is not truly an autonomous solution for large-scale public transportation.

Public transportation (trains). SNCF [24,25]. Cameras: 2; LIDARs: 4; RADARs: 4; ultrasonic: 2. Coverage: LIDARs with a 360-deg FOV; RADAR with a range of 140 m. Automation: GoA 4. Advantages: increased safety; reduced energy costs; faster response and less delay thanks to the autonomous features. Disadvantages: government restrictions limit public research outside the company; no outdoor autonomous train exists yet, nor is one expected until 2022 or 2023.

Smart farming (tractors). John Deere [29,30]. Cameras: 3; LIDARs: 0; RADARs: 0; ultrasonic: 0. Coverage: cameras covering a 250-m range. Automation: SAE Level 2. Advantages: capability of automating the farming and harvesting process; labor costs reduced by 14%, fuel consumption by 14%, seed costs by 6%, and fertilizer costs by 6%. Disadvantages: more ambitious projects, such as fully autonomous tractors, are still being developed and tested and are not expected in the mass market for another ten years.

Smart farming (robots). NAIO Technologies [33,34]. Cameras: 2; LIDARs: 2; RADARs: 0; ultrasonic: 0. Coverage: N/A. Automation: SAE Level 4. Advantages: long-term cost reduction for overall production; more precision and fewer human errors; a perfect tool for small or compact plots. Disadvantages: slow performance on large projects; several robots may be needed, which can be highly priced; miscalculations in some tasks, such as applying pesticides.

Smart farming (drones). DJI [31]. Cameras: 2; LIDARs: 0; RADARs: 4; ultrasonic: 0. Coverage: RADAR with a 100-deg FOV; cameras covering a 200-m range. Automation: SAE Level 4. Advantages: the drone's design improves the sensors' ability to scan the environment and precisely identify the field; a RADAR-based system is efficient because the drone operates in a controlled space where an expensive LIDAR is not required. Disadvantages: accuracy and precision flaws could be encountered; the scanning feature could misbehave by not identifying the correct plants.

Logistics. Mercedes-Benz [37]. Cameras: 5; LIDARs: 3; RADARs: 2; ultrasonic: 0. Coverage: lateral RADARs with a 130-deg FOV; short-range RADAR with a 130-deg FOV; full-range RADAR with a 250-m range and an 18-deg FOV; front stereo camera with a 100-m range. Automation: SAE Level 4. Advantages: implementation of a RADAR-based system for automated highway driving; a highway system connects with the vehicle to ensure safety and provide sensor data. Disadvantages: the vehicle does not implement LIDAR sensors, which could increase visibility through 3D mapping of the environment.

Logistics. Freightliner. Cameras: 6; LIDARs: 4; RADARs: 2; ultrasonic: 0. Coverage: long- and short-range RADARs (820 ft and 230 ft); stereo cameras at the front and back with a 328-ft range and a 45-deg FOV. Automation: SAE Level 4. Advantages: Highway Pilot system supporting data fusion and executing lane stability, collision avoidance, speed control, etc. Disadvantages: sensor information is missing to determine the negative aspects of the technology; the vehicle is currently a prototype.
Table 2

Recommended sensor configurations based on vehicle type and functionality

Each entry lists the vehicle type, recommended sensors, sensor configuration properties, and additional comments.

Personal vehicles. Recommended sensors: ultrasonic sensors, Velodyne LIDARs, RADAR, more than five cameras, GPS, and GNSS. Configuration properties: with this sensor configuration, the vehicle should be able to detect its entire surroundings; given the functionality of personal vehicles, LIDARs, ultrasonic sensors, and cameras should provide a 360-deg range of coverage. Additional comments: depending on the vehicle's level of automation, the producer can choose between ultrasonic sensors and a LIDAR; less autonomous vehicles can benefit from ultrasonic sensors due to their low cost, whereas a more autonomous car should implement LIDARs to benefit from 3D mapping and precision.

Shuttles/buses/trucks. Recommended sensors: Velodyne LIDARs, RADARs, cameras with a large FOV, GPS, and GNSS. Configuration properties: similar to the configuration chosen for personal vehicles; here, cameras should have a large field of view because of the vehicle's larger size, and more sensors are required to ensure proper sensor fusion in bigger vehicles. Additional comments: buses and shuttles could consider implementing ultrasonic sensors to reduce costs; these could also be fused with LIDARs for better object detection, considering that most vehicles of this type will have a basic level of automation.

Trains. Recommended sensors: RADARs with at least 120 m of range, LIDARs, cameras, and ultrasonic sensors. Configuration properties: for trains, the sensor configuration can vary; with this configuration, trains benefit from advanced sensing with RADARs and LIDARs, which should have a minimum range of 120 m to increase response time in case of accidents; GPS and GNSS are encouraged but not necessary because railroads are controlled and monitored. Additional comments: N/A.

Drones. Recommended sensors: RADARs, cameras with a large FOV, GNSS, GPS, IMUs, and barometers. Configuration properties: this configuration gives the drone the ability to monitor its surroundings with cameras and RADARs; GNSS and GPS help the drone with orientation; IMUs and barometers are crucial for registering weather changes. Additional comments: drones need only basic object detection because they avoid close contact with the ground, so LIDARs are not required for that purpose; in specific areas such as farming, attachments can give the drone extra functions and increase its utility.

Robots. Recommended sensors: RADARs, high-definition cameras, and IMUs. Configuration properties: robots are used in controlled spaces, and this configuration gives the robot the ability to accomplish tasks autonomously and accurately. Additional comments: most robots will need around 2–4 cameras, since there is no need for complete coverage of the surrounding area.
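A minimal sketch of how the recommendations in Table 2 could be encoded for reuse follows; the dictionary keys and helper function are ours and purely illustrative, while the sensor lists come from the table.

```python
# Table 2 recommendations keyed by vehicle type (illustrative encoding).
RECOMMENDED_SENSORS = {
    "personal vehicle": ["ultrasonic", "Velodyne LIDAR", "RADAR", "5+ cameras", "GPS", "GNSS"],
    "shuttle/bus/truck": ["Velodyne LIDAR", "RADAR", "large-FOV camera", "GPS", "GNSS"],
    "train": ["RADAR (>= 120 m range)", "LIDAR", "camera", "ultrasonic"],
    "drone": ["RADAR", "large-FOV camera", "GNSS", "GPS", "IMU", "barometer"],
    "robot": ["RADAR", "high-definition camera", "IMU"],
}

def recommend(vehicle_type: str) -> list:
    """Look up the Table 2 sensor recommendation for a vehicle type."""
    key = vehicle_type.strip().lower()
    if key not in RECOMMENDED_SENSORS:
        raise ValueError(f"no recommendation for vehicle type {vehicle_type!r}")
    return RECOMMENDED_SENSORS[key]

print(recommend("drone"))
# ['RADAR', 'large-FOV camera', 'GNSS', 'GPS', 'IMU', 'barometer']
```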

5 Conclusion

To conclude, autonomous vehicles are being developed and used more than ever. Companies are investing in this technology to offer new products and to incorporate these vehicles into their production systems to increase productivity. It is important to acknowledge that not every sensor configuration will work for every type of usage or implementation; that question was the initial motivation for this study. This paper presented a detailed analysis of case studies covering current autonomous vehicles available on the market and prototypes under development. Divided by industry, the study gathered sensor data from several companies across four industries (personal vehicles, public transportation, smart farming, and logistics). A comparison was then made to choose the best sensors based on vehicle type and functionality. Based on the advantages and disadvantages of each sensor configuration, this paper recommends general sensor configurations that adapt properly to a vehicle's properties and intended usage, whether the vehicle is integrated into a production process or deployed as a product on the market.

6 Future Work

The future of autonomous vehicles is promising: research and development are intensifying each day, and computer scientists are eager to discover new ways to introduce this technology into people's lifestyles. The topic presented in this research paper is extensive. Many autonomous-vehicle technologies have been deployed in recent months and years, but many others are still in the development stage, which limits research because no sensor data or information is yet available to analyze. Future researchers should specifically address new trends in autonomous vehicles and update this information whenever a new vehicle is deployed. Moreover, the sensor market offers few options for RADARs and LIDARs; Velodyne, for example, is one of the best-known manufacturers of LIDARs for autonomous vehicles.

Conflict of Interest

There are no conflicts of interest.

References

1. Varghese, J. Z., and Boone, R. G., 2015, "Overview of Autonomous Vehicle Sensors and Systems," International Conference on Operations Excellence and Service Engineering, Orlando, FL, Sept. 10–11, pp. 178–191.
2. Campbell, S., O'Mahony, N., Krpalcova, L., Riordan, D., Walsh, J., Murphy, A., and Ryan, C., 2018, "Sensor Technology in Autonomous Vehicles: A Review," Proceedings of the 9th Irish Signals and Systems Conference (ISSC), June 21–22, IEEE, pp. 1–4.
3. Zang, S., Ding, M., Smith, D., Tyler, P., Rakotoarivelo, T., and Kaafar, M. A., 2019, "The Impact of Adverse Weather Conditions on Autonomous Vehicles: How Rain, Snow, Fog, and Hail Affect the Performance of a Self-Driving Car," IEEE Veh. Technol. Mag., 14(2), pp. 103–111.
4. Steinbaeck, J., Steger, C., Holweg, G., and Druml, N., 2017, "Next Generation Radar Sensors in Automotive Sensor Fusion Systems," 2017 Sensor Data Fusion: Trends, Solutions, Applications (SDF), Bonn, Germany, Sept. 10–12, IEEE, pp. 1–6.
5. Hecht, J., 2018, "Lidar for Self-Driving Cars," Opt. Photonics News, 29(1), pp. 26–33.
6. Maddern, W., Stewart, A., McManus, C., Upcroft, B., Churchill, W., and Newman, P., 2014, "Illumination Invariant Imaging: Applications in Robust Vision-Based Localisation, Mapping and Classification for Autonomous Vehicles," Proceedings of the Visual Place Recognition in Changing Environments Workshop, IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, May 31–June 7, Vol. 2, p. 3.
7. Yan, C., Xu, W., and Liu, J., 2016, "Can You Trust Autonomous Vehicles: Contactless Attacks Against Sensors of Self-Driving Vehicle," Def Con, 24(8), p. 109.
8. Rahiman, W., and Zainal, Z., 2013, "An Overview of Development GPS Navigation for Autonomous Car," Proceedings of the IEEE 8th Conference on Industrial Electronics and Applications (ICIEA), Melbourne, Australia, June 19–21, IEEE, pp. 1112–1118.
9. Wang, Y., Chao, W.-L., Garg, D., Hariharan, B., Campbell, M., and Weinberger, K. Q., 2019, "Pseudo-Lidar From Visual Depth Estimation: Bridging the Gap in 3D Object Detection for Autonomous Driving," Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, pp. 8445–8453.
10. Xu, W., Yan, C., Jia, W., Ji, X., and Liu, J., 2018, "Analyzing and Enhancing the Security of Ultrasonic Sensors for Autonomous Vehicles," IEEE Internet Things J., 5(6), pp. 5015–5029.
11. Carballo, A., Lambert, J., Monrroy, A., Wong, D., Narksri, P., Kitsukawa, Y., Takeuchi, E., Kato, S., and Takeda, K., 2020, "LIBRE: The Multiple 3D LiDAR Dataset," arXiv preprint, https://arxiv.org/abs/2003.06129
12. Sun, P., Kretzschmar, H., Dotiwalla, X., Chouard, A., Patnaik, V., Tsui, P., Guo, J., et al., 2020, "Scalability in Perception for Autonomous Driving: Waymo Open Dataset," Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, June 23–26, pp. 2446–2454.
13. del Pino, I., Vaquero, V., Masini, B., Sola, J., Moreno-Noguer, F., Sanfeliu, A., and Andrade-Cetto, J., 2017, "Low Resolution Lidar-Based Multi-Object Tracking for Driving Applications," ROBOT'2017: Third Iberian Robotics Conference, Seville, Spain, Nov. 24, Springer, pp. 287–298.
14. Kutila, M., Pyykönen, P., Holzhüter, H., Colomb, M., and Duthon, P., 2018, "Automotive Lidar Performance Verification in Fog and Rain," Proceedings of the 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, Nov. 4–7, IEEE, pp. 1695–1701.
15. Khader, M., and Cherian, S., 2019, "An Introduction to Automotive Lidar," Accessed May 15.
16. Aeberhard, M., Rauch, S., Bahram, M., Tanzmeister, G., Thomas, J., Pilat, Y., Homm, F., Huber, W., and Kaempchen, N., 2015, "Experience, Results and Lessons Learned From Automated Driving on Germany's Highways," IEEE Intell. Transp. Syst. Mag., 7(1), pp. 42–57.
17. Meyer, J., Becker, H., Bösch, P. M., and Axhausen, K. W., 2017, "Autonomous Vehicles: The Next Jump in Accessibilities?," Res. Transp. Econ., 62, pp. 80–91.
18. Pessaro, B., 2016, "Evaluation of Automated Vehicle Technology for Transit – 2016 Update," Technical Report.
19. Montes, H., Salinas, C., Fernández, R., and Armada, M., 2017, "An Experimental Platform for Autonomous Bus Development," Appl. Sci., 7(11), p. 1131.
20. Kristiansson, K., and Kvist, D., 2018, "Determining Sensor Solution Enabling Future Autonomous Vehicles."
21. Iclodean, C., Cordos, N., and Varga, B. O., 2020, "Autonomous Shuttle Bus for Public Transportation: A Review," Energies, 13(11), p. 2917.
22. Christie, D. P., Koymans, A., Chanard, T., Vollichard, P., Lavadinho, S., Vincent-Geslin, S., Thémans, M., et al., 2015, "City Automated Transport System (CATS): The Legacy of an Innovative European Project," European Transport Conference, Frankfurt, Germany, Sept. 28–30.
23. Ganesh, S., 2020, "Vehicle Component Configuration Design and Packaging in Virtual Environment for Autonomous Electric Buses."
24. Trentesaux, D., Dahyot, R., Ouedraogo, A., Arenas, D., Lefebvre, S., Schön, W., Lussier, B., and Cheritel, H., 2018, "The Autonomous Train," Proceedings of the 13th Annual Conference on System of Systems Engineering (SoSE), Paris, France, June 19–22, IEEE, pp. 514–520.
25. Gebauer, O., Pree, W., and Stadlmann, B., 2012, "Autonomously Driving Trains on Open Tracks—Concepts, System Architecture and Implementation Aspects," it – Information Technology, 54(6), pp. 266–279.
26. Walter, A., Finger, R., Huber, R., and Buchmann, N., 2017, "Opinion: Smart Farming Is Key to Developing Sustainable Agriculture," Proc. Natl. Acad. Sci., 114(24), pp. 6148–6150.
27. Krishna, K. R., 2018, Agricultural Drones: A Peaceful Pursuit, Taylor & Francis, London.
28. Yaghoubi, S., Akbarzadeh, N. A., Bazargani, S. S., Bazargani, S. S., Bamizan, M., and Asl, M. I., 2013, "Autonomous Robots for Agricultural Tasks and Farm Assignment and Future Trends in Agro Robots," Int. J. Mech. Mechatron. Eng., 13(3), pp. 1–6.
29. Payne, C., 2005, "Technologies for Efficient Farming," Proceedings of the Electrical Insulation Conference and Electrical Manufacturing Expo, Indianapolis, IN, Oct. 23–26, IEEE, pp. 435–441.
30. Lomonosov, D. A., Shapar, M. S., Redkokashin, A. A., Borodin, I., and Lomonosova, I., 2020, "Economic Efficiency of Autotrac Universal John Deere Automatic Driving Kit as an Element of Precision Farming System (AMS) in Agricultural Enterprises of Primorsky Krai," J. Crit. Rev., 7(13), pp. 2835–2836.
31. Olson, D., and Anderson, J., "Review on Unmanned Aerial Vehicles, Remote Sensors, Imagery Processing, and Their Applications in Agriculture," Agron. J.
32. Carrier, A., 2020, "Practical Applications of Drones in Natural Resources: Where We Are and Where We're Going."
33. Bakker, T., van Asselt, K., Bontsema, J., Müller, J., and van Straten, G., 2006, "An Autonomous Weeding Robot for Organic Farming," Field and Service Robotics, Springer, pp. 579–590.
34. Guyonneau, R., Belin, E., Mercier, F., Ahmad, A., and Malavazi, F., "Autonomous Robot for Weeding."
35. Van Meldert, B., and De Boeck, L., 2016, "Introducing Autonomous Vehicles in Logistics: A Review From a Broad Perspective," FEB Research Report KBI 1618.
36. Monios, J., and Bergqvist, R., 2020, "Logistics and the Networked Society: A Conceptual Framework for Smart Network Business Models Using Electric Autonomous Vehicles (EAVs)," Technol. Forecast. Soc. Change, 151, p. 119824.
37. Ballarin, C., and Zeilinger, M., 2017, "The Truck of the Future: Autonomous and Connected Driving at Daimler Trucks," SAE Technical Paper.
38. Nemchenko, A., and Boyarskaya, A., 2016, "Mercedes-Benz Future Truck 2025."
39. Rasshofer, R. H., and Gresser, K., 2005, "Automotive Radar and Lidar Systems for Next Generation Driver Assistance Functions," Adv. Radio Sci., 3, pp. 205–209.
40. Leonard, J., How, J., Teller, S., Berger, M., Campbell, S., Fiore, G., Fletcher, L., et al., 2008, "A Perception-Driven Autonomous Urban Vehicle," J. Field Rob., 25(10), pp. 727–774.
41. Kouchak, S. M., and Gaffar, A., 2017, "Determinism in Future Cars: Why Autonomous Trucks Are Easier to Design," 2017 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computed, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), San Francisco, CA, Aug. 4–8, IEEE, pp. 1–6.