Abstract
Measurement performance evaluation of real and virtual automotive light detection and ranging (LiDAR) sensors is an active area of research. However, no commonly accepted automotive standards, metrics, or criteria exist to evaluate their measurement performance. ASTM International released the ASTM E3125-17 standard for the operational performance evaluation of 3D imaging systems commonly referred to as terrestrial laser scanners (TLS). This standard defines the specifications and static test procedures to evaluate the 3D imaging and point-to-point distance measurement performance of TLS. In this work, we have assessed the 3D imaging and point-to-point distance estimation performance of a commercial micro-electro-mechanical system (MEMS)-based automotive LiDAR sensor and its simulation model according to the test procedures defined in this standard. The static tests were performed in a laboratory environment. In addition, a subset of static tests was also performed at the proving ground in natural environmental conditions to determine the 3D imaging and point-to-point distance measurement performance of the real LiDAR sensor. Furthermore, the real scenarios and environmental conditions were replicated in the virtual environment of a commercial software tool to verify the working performance of the LiDAR model. The evaluation results show that the LiDAR sensor and its simulation model under analysis pass all the tests specified in the ASTM E3125-17 standard. This standard helps to understand whether sensor measurement errors are due to internal or external influences. We have also shown that the 3D imaging and point-to-point distance estimation performance of LiDAR sensors significantly impacts the working performance of object recognition algorithms. That is why this standard can be beneficial in validating real and virtual automotive LiDAR sensors, at least in the early stage of development. Moreover, the simulation and real measurements show good agreement at the point cloud and object recognition levels.
Keywords: micro-electro-mechanical systems, automotive LiDAR sensor, ASTM E3125-17 standard, advanced driver-assistance system, open simulation interface, functional mock-up interface, functional mock-up unit, point-to-point distance tests, user-selected tests, proving ground, PointPillars
1. Introduction
Advanced driver-assistance systems (ADAS) increase vehicle and road safety. Car manufacturers are installing ADAS in modern cars to enhance driver safety and comfort, as shown in Figure 1. An ADAS acquires the vehicle’s surrounding information from environmental perception sensors [1]. Light detection and ranging (LiDAR), radio detection and ranging (RADAR), cameras, and ultrasonics are the current sensing technologies used for ADAS [2]. As these sensors support human drivers, their measurement performance must be verified before they are installed in vehicles. However, the complexity of ADAS is also increasing over time, and validating such complex systems purely in the real world is not feasible due to cost and time constraints [3].
Therefore, the automotive industry has started considering type approval based on virtual tests, requiring virtual ADAS and environmental perception sensors [4]. Consequently, the virtual environmental perception sensor’s operational performance also needs to be verified before using them for the virtual validation of ADAS.
LiDAR sensors have gained significant attention for ADAS over the last few years due to their wider field of view (FoV) and higher ranging accuracy compared to RADAR [6]. Therefore, car manufacturers install them for autonomous vehicle prototyping [7,8,9]. As a result, LiDAR sensors have become a crucial sensor technology for the ADAS perception system. Hence, their performance evaluation is necessary before using them for ADAS.
Several methods are described in the literature to characterize the performance of LiDAR sensors [10,11,12,13,14,15,16,17,18,19]. However, no established uniform test standard is currently available for the operational performance assessment of real and virtual automotive LiDAR sensors. This year, two projects [20,21] were initiated to develop specifications and testing frameworks for the performance evaluation of automotive LiDAR sensors, but these standards have not yet been released. ASTM International has released a detailed standardized documentary test procedure, E3125-17, to evaluate the point-to-point distance measurement performance of medium-range laser scanners. A medium-range system can measure distances within the range of 2 m to 150 m. According to this standard, the LiDAR manufacturer should specify the size, material, and optical characteristics of the sphere and plate targets such that they yield a minimum of 300 and 100 LiDAR points, respectively, so that the dimensions or size of the targets can be estimated from the obtained LiDAR points. Furthermore, the LiDAR-derived point-to-point distance measurement error should not exceed the maximum permissible error (MPE) specified by the manufacturer, which is 20 mm in this case. A derived point is obtained by processing a set of points from the target surface and represents the target’s center. This standard also helps to understand whether sensor measurement errors are due to internal or external influences. The test standard is recommended for spherical coordinate 3D imaging systems but can also be applied to non-spherical coordinate systems [22].
Therefore, in this paper, we have evaluated the 3D imaging and point-to-point distance measurement performance of a Blickfeld micro-electro-mechanical systems (MEMS)-based automotive LiDAR sensor and its simulation model (LiDAR FMU) according to the ASTM E3125-17 test procedures. It should be noted that the same authors developed the LiDAR FMU model in their previous work [23] for the simulation-based testing of ADAS, and the present paper is a continuation of that work. The authors want to bring the ASTM E3125-17 standard to the attention of the scientific community focused on developing and validating real and virtual automotive LiDAR sensors. The test cases defined in this standard are easy to implement and can help to evaluate the measurement performance of real and virtual automotive LiDAR sensors, at least in the early stage of development. The first realization of this standard for the performance evaluation of terrestrial laser scanners (TLS) was reported in [24]. To the authors’ knowledge, this is the first research paper evaluating real and virtual MEMS-based automotive LiDAR sensor operational performance according to the ASTM E3125-17 standard test procedures.
The LiDAR sensor model is developed as a functional mock-up unit (FMU) by using the standardized open simulation interface (OSI) and functional mock-up interface (FMI) [23]. The model is packaged as an OSI sensor model packaging (OSMP) FMU [25]. Therefore, it was integrated successfully into the co-simulation environment of CarMaker from IPG Automotive. The virtual LiDAR sensor considers the accurate modeling of scan pattern and complete signal processing toolchain of the Cube 1 LiDAR sensor, as described in [23].
This paper is structured as follows: Section 2 describes the LiDAR sensor background. The specifications of the devices under test are given in Section 3. The ASTM E3125-17 standard test methods overview is discussed in Section 4, and the data analysis methodologies of the tests are described in Section 5. In addition, Section 6 provides a detailed description of the test setup and results. Finally, Section 7 and Section 8 provide the conclusion and outlook, respectively.
2. Background
LiDAR sensors can be classified as scanning and non-scanning LiDARs [26]. Flash LiDAR sensors are a non-scanning type of LiDAR. They can measure distances up to 50 m and are suitable for ADAS safety applications, including blind-spot detection (BSD) and forward collision warning (FCW) [27]. On the other hand, scanning LiDAR sensors consist of optical phased array (OPA) scanners, mechanical rotating scanners, or MEMS scanners [26]. These sensors can detect objects up to 250 m away and can be used for the lane departure warning system (LDWS), adaptive cruise control (ACC), BSD, and FCW [27,28]. Specifically, MEMS LiDAR sensors are gaining attention for automotive applications because they are small, lightweight, and power efficient [29]. Furthermore, MEMS-based LiDAR sensors are also used for agricultural and archaeological surveys as well as crowd analytics [26,30]. The following section discusses the working principle of MEMS-based LiDAR sensors in detail.
Working Principle of MEMS LiDAR Sensor
MEMS-based LiDAR consists of a laser and detector module (LDM) and a beam deflection unit (MEMS mirrors), as shown in Figure 2. The laser source emits laser pulses, and the beam deflection unit deflects the beam in different directions to obtain holistic imaging of the environment. The photodetector receives the laser light partly reflected from the target’s surface. It measures the round trip delay time (RTDT), i.e., the time the laser light takes to hit an object and return to the detector. With the RTDT $\tau$, the range R can be calculated as [23]:

$R = \frac{c \cdot \tau}{2}$ (1)

where the range is denoted by R, c is the speed of light, and $\tau$ is the RTDT, also known as the time of flight (ToF).
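For example, assuming an RTDT of $\tau$ = 100 ns, Equation (1) gives $R = \tfrac{1}{2} \cdot 299\,792\,458~\mathrm{m/s} \cdot 100~\mathrm{ns} \approx 15~\mathrm{m}$.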
3. Devices Under Test
In the context of this work, we have evaluated the 3D imaging and point-to-point distance estimation performance of the Blickfeld Cube 1 and its simulation model, the LiDAR FMU. Cube 1 was designed for industrial applications but is also used for automotive applications. Cube 1 comprises a single laser source and a beam deflection unit, the so-called mirrors. The laser beam is deflected by two 1D MEMS mirrors oriented horizontally and vertically and driven with a phase difference between them, which generates the elliptical scan pattern illustrated in Figure 3 [32,33]. In addition, the specifications of the LiDAR sensor are listed in Table 1.
Table 1.
| Parameters | Values |
|---|---|
| Typical application range | 1.5 m–75 m |
| Range resolution | <1 |
| Range precision (bias-free RMS, at 10 m, 50% reflective target); the standard deviation of the range precision is 1σ, corresponding to a coverage of 68.26% [34] | <2 |
| FoV (H × V) | × |
| Horizontal resolution | – (user configurable) |
| Vertical resolution | 5–400 scan lines per frame (user configurable) |
| Frame rate | up to 50 Hz (user configurable) |
| Laser wavelength | 905 nm |
LiDAR FMU Model
This section will provide an overview of the toolchain and signal processing steps of the LiDAR FMU model. A detailed description of the LiDAR FMU modeling methodology can be found in [23]. The toolchain and signal processing steps of the LiDAR FMU model are shown in Figure 4. As mentioned in Section 1, the model is built as an OSMP FMU. It uses the virtual environment and ray tracing module of CarMaker.
The material properties of the simulated objects are specified in the material library of CarMaker, and the test scenarios are generated in the virtual environment of CarMaker. After that, the FMU controller calls the LiDAR Simulation Library and passes the required input configuration via osi3::LidarSensorViewConfiguration to CarMaker. CarMaker verifies the input configuration and passes the ray tracing data via osi3::LidarSensorView::reflection to the LiDAR FMU for further processing. The simulation controller is the main component of the LiDAR Simulation Library. It provides the interaction with the library, for instance, configuring the simulation pipeline, inserting ray tracing data, executing the simulation steps, and retrieving the results. The Link Budget Module calculates the photons received over time. The Detector Module captures these photon arrivals and converts them into a photocurrent signal. In the next step, the Circuit Module amplifies the photocurrent signal of the detector to a voltage signal, which is processed by the Ranging Module. The Ranging Module determines the range and intensity of the target from the voltage signal received from the analog circuit for every reflected scan point. The Effect Engine (FX engine) includes a series of interfaces that interact with environmental or sensor-specific imperfections and simulation blocks [23]. The sensor model specifications are the same as those of Cube 1 given in Table 1, and it uses the same scan pattern as the real sensor, shown in Figure 3.
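To make the data flow tangible, the following minimal Python sketch chains the four processing stages in the order described above. All function names, data layouts, and scaling factors are hypothetical illustrations and do not correspond to the actual LiDAR Simulation Library API of [23].

```python
C = 299_792_458.0  # speed of light in m/s

def link_budget(reflections):
    """Photons received over time for one scan point (hypothetical scaling)."""
    return [(t, power * 1e3) for t, power in reflections]

def detector(photons):
    """Convert photon arrivals into a sampled photocurrent signal."""
    return [(t, 0.8 * n) for t, n in photons]

def circuit(currents):
    """Amplify the photocurrent into a voltage signal."""
    return [(t, 1e4 * i) for t, i in currents]

def ranging(voltages):
    """Peak detection: return range and intensity of the strongest echo."""
    t_peak, v_peak = max(voltages, key=lambda tv: tv[1])
    return 0.5 * C * t_peak, v_peak

def simulate_scan_point(reflections):
    """Run one list of ray tracing reflections through the full chain."""
    return ranging(circuit(detector(link_budget(reflections))))

# Example: a single echo arriving after 100 ns yields a target at about 15 m.
print(simulate_scan_point([(100e-9, 2.5e-6)]))
```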
4. ASTM E3125-17
This standard defines the specifications and test procedures to evaluate the measurement performance of medium-range 3D imaging systems that produce point clouds of an object of interest. Two types of tests are defined in this standard: two-face tests and point-to-point distance tests to evaluate the measurement performance of the device under test (DUT) [22]. Two-face tests are recommended for 3D-imaging systems that simultaneously measure the target in front and back modes. However, these tests are not applicable in this use case because Cube 1 can only measure in front mode. The point-to-point distance tests involve the measurements of target distances using the DUT and reference instrument (RI) in various orientations and positions within the DUT measurement capability. These tests also provide the impact of mechanical and optical misalignment within the DUT on the measured distance to the target [22].
4.1. Specification of Targets
This standard specifies a sphere and a plate as targets for the point-to-point distance tests. The sphere target is used for all point-to-point distance tests except the relative range tests, which use the plate target. According to the ASTM standard, the type of target materials, including their optical characteristics, should be specified by the DUT manufacturer because the material and optical properties influence the measurement. Users can choose targets of any material if the DUT manufacturer does not specify them. Moreover, the sphere and plate targets should be big enough to yield a minimum of 300 and 100 measured points, respectively, after point selection, so that the size or dimensions of the targets can be estimated from the LiDAR point clouds. In this work, we use a diffusely reflective sphere target made of plastic, as shown in Figure 5. The manufacturer has specified the diameter of the sphere target with an uncertainty of ±1 mm. However, we conducted several measurements in the lab and measured the diameter of the sphere target with a Hexagon Romer Absolute Arm 7525 SI [35] at a confidence level of 95%. In addition, we use a rectangular laser scanner checkerboard, as shown in Figure 6, for the relative range tests.
4.2. Inside Test
This test involves the distance measurement between two sphere targets placed equidistant to the DUT and whose centers are nominally collinear with the DUT origin, as shown in Figure 7. The centers of the spheres should be aligned so that, if the azimuth angle of sphere A relative to the DUT is θ, the azimuth angle of sphere B is within the specified tolerance of θ + 180°. The sphere targets and the sensor shall be at the same height, so the elevation angles of spheres A and B should be within the tolerance specified in the standard. The DUT should scan the objects in both front-sight and back-sight modes. If the DUT can measure only in front-sight mode, the user can measure both targets in front-sight mode, but the sensor needs to be rotated to measure the back sphere [22].
4.3. Symmetric Test
This test involves the point-to-point distance measurement of sphere targets placed symmetrically to the DUT in different orientations and positions, as shown in Figure 8. These tests can be realized with two different azimuth angles between the DUT and the center of line AB. This test involves eight measurements, either in front-sight mode or in back-sight mode. The user can select any distance between the pairs of spheres and any distance from the DUT that meets the angular sweep requirements mentioned in the standard, and those values should be within the work volume of the DUT. The angular sweep between the sphere targets, in the plane containing the targets’ centers and the DUT origin, shall be at least the minimum value specified in the standard [22].
4.4. Asymmetric Test
This test involves the point-to-point distance measurement of sphere targets placed asymmetrically with respect to the DUT in different orientations and positions, as shown in Figure 9. This test is similar to the symmetric test, but the placement of the sphere targets is different. It involves six measurements with two orientations of the DUT at different azimuth angles. The user can select any distance between the pairs of spheres and any distance from the DUT that meets the angular sweep requirements mentioned in the standard, and those values should be within the work volume of the DUT. However, the angular sweep between the pairs of sphere targets shall be at least the minimum value specified in the standard.
4.5. Relative Range Test
The relative-range test involves the distance measurement of a plate target at a reference position and three different test positions (AB, AC, AD) in a front-sight mode, as shown in Figure 10. The center of the planar target and the origin of the DUT should be collinear. The reference distance of the DUT should be specified by the manufacturer. If this is not the case, then the user is allowed to choose any distance within the work volume of the DUT [22].
4.6. User-Selected Tests
The standard requires two additional user-selected tests to complete the point-to-point distance measurement tests. Again, the sphere or the planar object can be used as the target. We are using a DUT for automotive applications, which is why we use a vehicle and a 10% reflective Lambertian plate as targets for these tests. It should be noted that we measured the reflectivity of the real vehicle before using it for these tests. Furthermore, the measurements are performed at the Jtown proving ground [36] because we want to validate the measurement performance of the DUT in sunlight. The DUT was mounted on the test vehicle’s roof, as shown in Figure 11.
It should be noted that the inside, symmetric, asymmetric, and relative range tests are performed in the lab to evaluate the 3D imaging and distance measurement performance of Cube 1, and the sensor origin and target center are collinear for these tests. However, for the user-selected tests, the DUT was placed on the vehicle roof, so the sensor origin is not collinear with the target center.
5. Data Analysis
The ASTM standard introduces a calculation method to find the coordinates of the derived points for sphere and plate targets. This process is repetitive; thus, Python and MATLAB-based programs are used for data analysis, filtering, and least squares fitting.
5.1. Calculation of Sphere Target Derived Points Coordinates
The ASTM E3125-17 standard has introduced a procedure to calculate the derived point for a sphere target, as shown in Figure 12.
Initial segmentation: The measured data corresponding to the sphere targets shall be segmented from the surroundings since the DUT measures every object in its work volume [22]. Exemplary point clouds before and after the initial segmentation are shown in Figure 13. The points obtained after the initial segmentation form the initial point set.
Initial estimation: The initial estimation is used to find the coordinates of the derived point, which is the center of the point set received from the surface of the sphere target [22]. Several methods are introduced in the standard for the initial estimation, including manual estimation, the software provided by the DUT manufacturer, and the closest point method [22]. In this work, we have used the closest point method to estimate the derived point, as shown in Figure 14. First, the Euclidean distances of all LiDAR points in the segmented data set to the origin of the DUT are calculated, and the median of the M closest distances from the DUT origin is determined, as shown in Figure 14a. Afterward, a cut-off distance is calculated by adding half the radius of the sphere target to this median distance, as illustrated in Figure 14b. The points lying within the cut-off distance form the point set used for the initial sphere fit [22].
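A minimal NumPy sketch of the closest point method as we understand it from the description above; the function name, the default value of m, and the assumption that the DUT origin is at (0, 0, 0) are our own choices:

```python
import numpy as np

def closest_point_selection(points, sphere_radius, m=10):
    """Select points for the initial sphere fit (closest point method sketch).

    points: (n, 3) array of segmented points in the DUT coordinate frame,
    with the DUT origin at (0, 0, 0). Returns the points whose distance to
    the origin is within (median of the m closest distances) + radius / 2.
    """
    distances = np.linalg.norm(points, axis=1)
    d_min = np.median(np.sort(distances)[:m])   # median of the m closest points
    d_cut = d_min + sphere_radius / 2.0         # add half the nominal radius
    return points[distances <= d_cut]
```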
Initial least squares sphere fit: A non-linear, orthogonal, least squares sphere fit (LSSF) is applied to the selected points to determine the initial derived point. The general equation of the sphere can be expressed as follows [37]:

$(x - x_0)^2 + (y - y_0)^2 + (z - z_0)^2 = R^2$ (2)

where x, y, and z are the cartesian coordinates of the points on the sphere’s surface, and $(x_0, y_0, z_0)$ and R represent the center and radius of the sphere, respectively. Equation (2) is expanded and rearranged as [37]

$x^2 + y^2 + z^2 = 2 x x_0 + 2 y y_0 + 2 z z_0 + R^2 - x_0^2 - y_0^2 - z_0^2$ (3)

To apply the least squares fit to all points obtained from the sphere surface, Equation (3) can be expressed in vector/matrix notation for all n points in the data set as given in [37]

$\vec{f} = A \, \vec{c}$ (4)

$\vec{f} = \begin{bmatrix} x_1^2 + y_1^2 + z_1^2 \\ \vdots \\ x_n^2 + y_n^2 + z_n^2 \end{bmatrix}$ (5)

$A = \begin{bmatrix} 2 x_1 & 2 y_1 & 2 z_1 & 1 \\ \vdots & \vdots & \vdots & \vdots \\ 2 x_n & 2 y_n & 2 z_n & 1 \end{bmatrix}$ (6)

$\vec{c} = \begin{bmatrix} x_0 \\ y_0 \\ z_0 \\ R^2 - x_0^2 - y_0^2 - z_0^2 \end{bmatrix}$ (7)

Here, $x_1$, $y_1$, and $z_1$ represent the first point of the data set, and $x_n$, $y_n$, and $z_n$ the last point. Vector $\vec{f}$, matrix A, and vector $\vec{c}$ contain the expanded terms of the sphere Equation (3). Solving Equation (4) in the least squares sense yields the vector $\vec{c}$, from which the sphere’s center coordinates and radius R are obtained. We used the least squares function of the Python NumPy library [38] to compute $\vec{c}$, which returns the sphere’s center coordinates and radius R. A sphere can then be fitted to the original data set by using the components of $\vec{c}$.
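For illustration, a minimal NumPy sketch of the algebraic formulation in Equations (4)–(7); the function name, the array layout, and the use of numpy.linalg.lstsq are our own choices and only approximate the fit described in the text:

```python
import numpy as np

def sphere_fit(points):
    """Algebraic least squares sphere fit following Equations (4)-(7).

    points: (n, 3) array of x, y, z coordinates on the sphere surface.
    Returns the center (x0, y0, z0) and radius R.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    f = x**2 + y**2 + z**2                                   # Equation (5)
    A = np.column_stack((2 * x, 2 * y, 2 * z, np.ones_like(x)))  # Equation (6)
    c, *_ = np.linalg.lstsq(A, f, rcond=None)                # solve Equation (4)
    x0, y0, z0 = c[:3]
    # c[3] = R^2 - x0^2 - y0^2 - z0^2  (Equation (7))
    R = np.sqrt(c[3] + x0**2 + y0**2 + z0**2)
    return np.array([x0, y0, z0]), R
```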
Cone cylinder method: As recommended in the standard, in the next step, we refine the derived point coordinates through the cone cylinder method for the sphere target, as shown in Figure 15. A straight line is drawn between the origin of the DUT O and the initial derived point, as given in Figure 15a. A new point data set is generated from the initially segmented points, namely those which lie within both cones shown in Figure 15b,c [22].
Second least squares sphere fit: Furthermore, an orthogonal non-linear LSSF is applied to this data set to find the updated derived point of the sphere target [22].
Calculation of residuals and standard deviation: Afterward, the residual and standard deviation of every point within this set are calculated. The residual of a point is its deviation from the sphere defined by the updated derived point and the fitted radius. In the next step, a new point set is defined that includes only the points whose absolute residual value is less than three times the standard deviation [22].
Third least squares sphere fit: On this new set, another LSSF is performed to find the updated derived point [22].
Calculation of final derived point coordinates: The final derived point is determined after repeating the previous procedure at least four more times, as recommended in the standard. The derived point obtained in one iteration is used as the starting point for the subsequent iteration [22]. A comparison of the sphere’s point cloud after the initial and final LSSF is given in Figure 16.
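A condensed sketch of the iterative refinement, reusing the sphere_fit function from the previous sketch. Note that the cone cylinder selection of Figure 15 is omitted here; the loop only applies the three-sigma residual criterion and refits, so it is an approximation of the full procedure:

```python
import numpy as np

def refine_derived_point(points, iterations=5):
    """Iterative LSSF refinement with 3-sigma residual filtering (sketch)."""
    subset = points
    for _ in range(iterations):
        center, radius = sphere_fit(subset)                   # see previous sketch
        residuals = np.linalg.norm(subset - center, axis=1) - radius
        subset = subset[np.abs(residuals) < 3.0 * residuals.std()]
    return sphere_fit(subset)  # final derived point and radius
```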
Test Acceptance Criteria
- The distance between the initial estimate and the final derived point shall be less than 20% of the nominal radius of the sphere target, i.e., approximately 100 mm × 0.2 = 20 mm in this case; otherwise, another initial estimation should be conducted [22].
- According to the specifications of the DUT, the distance MPE is equal to 20 mm; the distance error shall therefore be less than 20 mm [22]. The distance between the sensor origin and the derived point measured by the DUT can be written as:

$d_{DUT} = \sqrt{(x_t - x_s)^2 + (y_t - y_s)^2 + (z_t - z_s)^2}$ (8)

where the target’s x, y, and z coordinates are denoted by the subscript t and the sensor’s by s [39]. The corresponding reference distance is obtained from the reference instrument as:

$d_{RI} = \sqrt{(x_t - x_{RI})^2 + (y_t - y_{RI})^2 + (z_t - z_{RI})^2}$ (9)

where the target’s x, y, and z coordinates are denoted by the subscript t and the reference instrument’s by RI [39]. We used a Leica DISTO S910 as the RI for the real measurements, and the OSI ground truth interface osi3::GroundTruth is used to retrieve the sensor’s origin and the target’s center position in 3D coordinates in the virtual environment [40,41]. The distance error between the two derived distances is then:

$E = d_{DUT} - d_{RI}$ (10)

- In the case of the sphere target, the number of points in the final data set shall be greater than 300 [22].
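A small sketch of the acceptance check implied by Equations (8)–(10); the coordinates below are hypothetical and are chosen only so that the resulting error matches the Cube 1 front-sphere row of Table 3:

```python
import numpy as np

MPE_MM = 20.0  # maximum permissible error of the DUT in mm

def distance_error(derived_point_mm, sensor_origin_mm, d_ri_mm):
    """Distance error E = d_DUT - d_RI per Equations (8)-(10), in mm."""
    d_dut = np.linalg.norm(np.asarray(derived_point_mm) - np.asarray(sensor_origin_mm))
    return d_dut - d_ri_mm

# Hypothetical derived point 6689 mm in front of the sensor, RI reads 6680 mm.
e = distance_error([0.0, 6689.0, 0.0], [0.0, 0.0, 0.0], 6680.0)
print(e, abs(e) < MPE_MM)  # 9.0 True
```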
5.2. Calculating Coordinates of Derived Point for the Plate Target
The standard has introduced a procedure to calculate the derived point for a plate target. Figure 17 shows the steps to calculate the derived point for the plate target.
Initial data segmentation: The DUT provides point clouds from all objects within its work volume. Therefore, all points received from objects of no interest need to be filtered out, as shown in Figure 18. The points remaining after the initial segmentation form the initial point set [22].
Point selection for plane fit: Afterward, as required in the standard, the points measured at the edges of the rectangular plate are removed before fitting a plane. This reduced point set is used for the plane fit [22].
Least squares plane fit: The least squares plane fit (LSPF) method defined in [42] is applied to this point set to determine the location and orientation of the plate target. In addition, the standard deviation s of the plane-fit residuals q is computed at each position of the plate target, as required in the standard. The plane-fit residuals q are the orthogonal distances of every measured point of the plate target to its respective plane [22].
Second data segmentation: The points whose residuals q are greater than double the corresponding standard deviation s are eliminated to obtain the best plane fit, as suggested in the standard. The updated point set should contain more than 95% of all measured points from the plate target. The distance error and the root mean square (RMS) dispersion of the residuals q in this set are calculated at the reference position and each test position using

$\mathrm{RMS} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} q_i^2}$ (11)

where $q_i$ is the residual of the i-th point, and N denotes the number of points in the subset [22].
Derived point for plate target: Although the plate target has a fiducial mark, it was still challenging to determine a derived point precisely at the center of the plate target. Because of that, we use the 3D geometric center method on the final point set to determine the derived point of the plate target, as recommended in the standard [22].
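A minimal sketch of the plate analysis: an orthogonal plane fit through the centroid via SVD, the two-sigma segmentation, the RMS of Equation (11), and the 3D geometric center used as the derived point. This is our own simplified illustration, not the NIST fitting algorithm of [42]:

```python
import numpy as np

def plate_analysis(points):
    """Plane-fit residual RMS (Equation (11)) and plate derived point (sketch).

    points: (n, 3) array of plate points after edge removal.
    """
    centroid = points.mean(axis=0)
    # Plane normal = right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    q = (points - centroid) @ normal           # orthogonal residuals
    keep = np.abs(q) < 2.0 * q.std()           # second data segmentation
    rms = np.sqrt(np.mean(q[keep] ** 2))       # Equation (11)
    derived_point = points[keep].mean(axis=0)  # 3D geometric center
    return rms, derived_point
```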
Test Acceptance Criteria
- In the case of the plate target, the number of points in the final segmented data set shall be greater than 100 [22].
- The derived relative range error at each test position shall be less than the MPE of the DUT, i.e., 20 mm [22].
6. Test Setups and Results
6.1. Inside Test
We have created a test setup in real and virtual environments, as shown in Figure 19, according to the specification given in Section 4.2. Cube 1 and the LiDAR FMU model use the same scan pattern with 500 scan lines per frame and identical horizontal angle spacing; the shape of the scan pattern is as given in Figure 3. It should be noted that the standard places no restriction on the settings of the DUT, including the FoV, the number of scan lines, and the angular resolution.
Before executing the inside test, we measured the diameter of the simulated and real sphere targets from the received LiDAR points, and the results are tabulated in Table 2. The results of the inside test are given in Table 3. We used the osi3::GroundTruth to retrieve the ground truth data in the simulation environment. The 3D objects are rendered in Blender 3D software [43] and integrated into the virtual environment of CarMaker.
Table 2.

| | x (mm) | y (mm) | z (mm) | Reference Diameter (mm) | Measured Diameter (mm) | Diameter Error (mm) |
|---|---|---|---|---|---|---|
| Cube 1 | 1.3 | 6672.1 | 125.5 | 200.9 | 201.2 | 0.3 |
| LiDAR FMU | 2.1 | 6672.4 | 121.2 | 200.0 | 200.1 | 0.1 |
Table 3.

| | Target | No. of Points (1) | Reference Distance to Target (mm) | Measured Distance (mm) | Distance Error (mm) | MPE (mm) | Pass/Fail |
|---|---|---|---|---|---|---|---|
| Cube 1 | Front sphere | 354 | 6680.0 | 6689.0 | 9.0 | 20.0 | Pass |
| Cube 1 | Back sphere | 358 | 6680.0 | 6686.3 | 6.3 | 20.0 | Pass |
| LiDAR FMU | Front sphere | 358 | 6680.0 | 6685.5 | 5.5 | 20.0 | Pass |
| LiDAR FMU | Back sphere | 358 | 6680.0 | 6685.5 | 5.5 | 20.0 | Pass |
The diameter of the sphere from the real and simulated LiDAR points can be estimated correctly, and the diameter error is negligible, as given in Table 2.
6.2. Symmetric Test
We have created the test setup according to the specifications defined in Section 4.3. The real and simulated test setups of the symmetric test are shown in Figure 20.
The DUT measures the two sphere targets, placed in different orientations, simultaneously in this test. It should be noted that the standard places no restriction on the distance between the two sphere targets. We chose the distance between the sphere targets considering the low vertical FoV of the DUTs. The LiDAR FMU and Cube 1 use the same scan pattern as in the inside test, but with a different horizontal angle spacing. A total of eight measurements are taken in the front-sight mode, and all the tests are analyzed with a single azimuth angle between the DUT and the center line of the sphere targets because the second azimuth angle given in Section 4.3 lies outside the DUTs’ work volume. The simulation and real results for the test positions A, B, C, and D are enumerated in Table 4.
Table 4.

| | Test Position | Target | No. of Points (1) | Reference Distance to Target (mm) | Measured Distance (mm) | Distance Error (mm) | MPE (mm) | Pass/Fail |
|---|---|---|---|---|---|---|---|---|
| Cube 1 | A | Left sphere | 326 | 5050.0 | 5056.2 | 6.2 | 20.0 | Pass |
| Cube 1 | A | Right sphere | 319 | 5050.0 | 5059.1 | 9.1 | 20.0 | Pass |
| LiDAR FMU | A | Left sphere | 327 | 5050.0 | 5055.7 | 5.7 | 20.0 | Pass |
| LiDAR FMU | A | Right sphere | 327 | 5050.0 | 5055.7 | 5.7 | 20.0 | Pass |
| Cube 1 | B | Top sphere | 321 | 5050.0 | 5058.2 | 8.2 | 20.0 | Pass |
| Cube 1 | B | Bottom sphere | 325 | 5050.0 | 5057.3 | 7.3 | 20.0 | Pass |
| LiDAR FMU | B | Top sphere | 322 | 5050.0 | 5055.9 | 5.9 | 20.0 | Pass |
| LiDAR FMU | B | Bottom sphere | 323 | 5050.0 | 5055.8 | 5.8 | 20.0 | Pass |
| Cube 1 | C | Top sphere | 338 | 5050.0 | 5059.3 | 9.3 | 20.0 | Pass |
| Cube 1 | C | Bottom sphere | 343 | 5050.0 | 5058.8 | 8.8 | 20.0 | Pass |
| LiDAR FMU | C | Top sphere | 340 | 5050.0 | 5055.6 | 5.6 | 20.0 | Pass |
| LiDAR FMU | C | Bottom sphere | 339 | 5050.0 | 5055.8 | 5.8 | 20.0 | Pass |
| Cube 1 | D | Top sphere | 333 | 5050.0 | 5058.1 | 8.1 | 20.0 | Pass |
| Cube 1 | D | Bottom sphere | 332 | 5050.0 | 5057.8 | 7.8 | 20.0 | Pass |
| LiDAR FMU | D | Top sphere | 336 | 5050.0 | 5055.4 | 5.4 | 20.0 | Pass |
| LiDAR FMU | D | Bottom sphere | 338 | 5050.0 | 5055.2 | 5.2 | 20.0 | Pass |
6.3. Asymmetric Tests
The real and simulation test setups were created to perform the asymmetric tests, as shown in Figure 21, according to the specifications given in Section 4.4.
Compared to the previous tests, we used an extra stand to ensure that the steel bar could not move. In addition, the sphere targets have magnetic holders that make them easy to fix on the steel bar. The distance between the sphere targets was chosen to meet the angular sweep requirement for the asymmetric tests. Cube 1 and the LiDAR FMU use the same scan pattern as in the symmetric tests. The simulation and real results for the asymmetric tests are enumerated in Table 5.
Table 5.

| | Test Position | Target | No. of Points (1) | Reference Distance to Target (mm) | Measured Distance (mm) | Distance Error (mm) | MPE (mm) | Pass/Fail |
|---|---|---|---|---|---|---|---|---|
| Cube 1 | A | Center sphere | 332 | 5050.0 | 5057.7 | 7.7 | 20 | Pass |
| Cube 1 | A | Left sphere | 323 | 5050.0 | 5058.3 | 8.3 | 20 | Pass |
| LiDAR FMU | A | Center sphere | 339 | 5050.0 | 5055.7 | 5.7 | 20 | Pass |
| LiDAR FMU | A | Left sphere | 325 | 5050.0 | 5055.9 | 5.9 | 20 | Pass |
| Cube 1 | B | Top sphere | 319 | 5050.0 | 5058.2 | 8.2 | 20 | Pass |
| Cube 1 | B | Center sphere | 328 | 5000.0 | 5006.9 | 6.9 | 20 | Pass |
| LiDAR FMU | B | Top sphere | 323 | 5050.0 | 5055.5 | 5.5 | 20 | Pass |
| LiDAR FMU | B | Center sphere | 331 | 5000.0 | 5005.4 | 5.4 | 20 | Pass |
| Cube 1 | C | Top sphere | 317 | 5000.0 | 5008.8 | 8.8 | 20 | Pass |
| Cube 1 | C | Left sphere | 322 | 5050.0 | 5057.4 | 7.4 | 20 | Pass |
| LiDAR FMU | C | Top sphere | 323 | 5000.0 | 5005.7 | 5.7 | 20 | Pass |
| LiDAR FMU | C | Left sphere | 324 | 5050.0 | 5055.5 | 5.5 | 20 | Pass |
Cube 1 and the LiDAR FMU model pass the inside, symmetric, and asymmetric tests because the number of received points from the sphere targets is more than 300 and the distance error is less than 20 mm. Furthermore, the real sensor and the LiDAR FMU use the same scan pattern, and the sizes and orientations of the simulated and real objects are also similar. That is why the numbers of points obtained from the surfaces of the actual and simulated spheres match well. However, the distance error of the real measurements is slightly higher than the simulation results for all the tests given above; a possible reason for this deviation is an uncertainty in the reference measurement due to human error, because it was very challenging to align the RI laser point with the center of the sphere target.
6.4. Relative Range Tests
A test setup, as given in Section 4.5, was created to perform the relative range tests. The static simulation scene and real measurement setups are shown in Figure 22.
The reference distance is calculated from the sensor origin to the center of the target. The reference position A was at 6 m, while the three test positions B, C, and D were at 8, 9, and 10 m, respectively. The relative test distances from the reference position were therefore AB = 2 m, AC = 3 m, and AD = 4 m. The center of the DUTs and the fiducial point of the rectangular plate were collinear. The results of the relative range tests are given in Table 6.
Table 6.

| | Target Position | No. of Points (1) | Reference Distance to Target (mm) | Measured Distance (mm) | Distance Error (mm) | MPE (mm) | RMS of Residuals (mm) | Pass/Fail |
|---|---|---|---|---|---|---|---|---|
| Cube 1 | AB | 451 | 2000.0 | 2005.7 | 5.7 | 20 | 1.5 | Pass |
| Cube 1 | AC | 290 | 3000.0 | 3005.7 | 5.7 | 20 | 1.7 | Pass |
| Cube 1 | AD | 208 | 4000.0 | 4007.3 | 7.3 | 20 | 1.8 | Pass |
| LiDAR FMU | AB | 462 | 2000.0 | 2005.5 | 5.5 | 20 | 1.1 | Pass |
| LiDAR FMU | AC | 298 | 3000.0 | 3005.5 | 5.4 | 20 | 0.6 | Pass |
| LiDAR FMU | AD | 217 | 4000.0 | 4005.5 | 5.4 | 20 | 0.3 | Pass |
Cube 1 and the LiDAR FMU pass the relative range tests because the number of received points from the plate target is more than 100 and the distance error is below 20 mm. It should also be noted that the relative range test distance error is lower than in the inside, symmetric, and asymmetric tests because it was easier to point the RI at the fiducial point of the plate target than at the sphere target.
6.5. Uncertainty Budget for ASTM E3125-17 Tests
This section provides the uncertainty budget for all the ASTM E3125-17 tests performed in real and virtual environments.
6.5.1. Uncertainty Budget of Real Measurements
1. Contribution of RI (external influences): The RI has a specified range accuracy for distances up to 10 m with a confidence level of 95%. That is why we consider a corresponding range uncertainty due to the RI for the ASTM E3125-17 tests, because we place the targets within 10 m.
2. Contribution of misalignment between the target and RI center (external influences): We aligned the center of the targets and the laser point of the RI manually, and it is difficult to always aim the laser point at the center of the sphere compared to the plate target. The highest standard uncertainty due to this factor occurred for the top sphere of test position C of the symmetric tests; for all the other tests, the standard uncertainty due to this factor is smaller.
3. Contribution of environmental conditions (external influences): All the tests were performed in the lab; therefore, the influence of environmental conditions on the measurements is negligible.
4. Contribution of DUT internal influences (internal influences): The ranging error due to the internal influences of the DUT contributes to all the tests. These internal influences include the ranging error due to the internal reflection of the sensor, detector losses, the peak detection algorithm, and precision loss due to the conversion of spherical coordinates to cartesian coordinates. It should be noted that the distance error due to the sensor’s internal influences may vary depending on the temperature (see point 3 above).
6.5.2. Uncertainty Budget of Simulation
Contribution of DUT (internal influences): As given above, the LiDAR FMU simulation model considers the exact scan pattern, signal processing chain, and sensor-related effects of the Blickfeld Cube 1. Therefore, the uncertainty due to the internal influences of the sensor model corresponds to that of the real sensor (see point 4 above).
Contribution of environmental conditions effect model (external influences): Environmental conditions effects are not modeled for these tests.
6.6. Comparison of Simulation and Real Measurements Results
The simulation and real measurements show good qualitative agreement. Therefore, we use the mean absolute percentage error metric (MAPE) to quantify the difference between the simulation and real measurement results for all the tests described above.
$\mathrm{MAPE} = \frac{1}{n} \sum_{i=1}^{n} \left| \frac{A_i - F_i}{A_i} \right| \times 100\%$ (12)

where $F_i$ is the simulated value, the measured value is denoted by $A_i$, and n is the total number of data points [44]. For the tests described above, the MAPE for the point-to-point distance estimation d is 0.04%, and for the number of received points it is 1.4%.
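As an illustration, a minimal implementation of Equation (12); the example values are the front- and back-sphere point counts of the inside test (Table 3) and serve only to show the calculation:

```python
import numpy as np

def mape(measured, simulated):
    """Mean absolute percentage error (Equation (12)) in percent."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return float(np.mean(np.abs((measured - simulated) / measured)) * 100.0)

# Number of points on the front and back spheres (real vs. simulated, Table 3).
print(mape([354, 358], [358, 358]))  # about 0.56 %
```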
6.7. User-Selected Tests
The inside, symmetric, asymmetric, and relative range tests were conducted in the lab. Therefore, we conducted several static tests to evaluate DUT measurement performance in sunlight at the proving ground. First, as shown in Figure 23, we recorded the daylight and modeled it in the LiDAR FMU [23]. We use a 10% reflective Lambertian plate and a Toyota Prius as targets for these tests, as shown in Figure 24. The ego and target vehicles were equipped with a global navigation satellite/inertial navigation system (GNSS/INS) for reference measurements. The scan pattern used by Cube 1 and LiDAR FMU for the user-defined tests is given in Figure 3. The exemplary LiDAR points received from the simulated and real Lambertian plate and vehicle targets are shown in Figure 25 and Figure 26. In addition, the simulation and real measurement results for the user-selected tests are given in Figure 27 and Figure 28.
The results show that the derived point-to-point distance estimation error of the real and virtual LiDAR sensors is less than the MPE. However, the distance error increases when the vehicle target is placed at a distance of 20 m in daylight. This is because the sunlight raises the background noise of the LiDAR signal, and the point clouds become noisy and dispersed, shifting the derived point of the vehicle target. It should be noted that we use the manual estimation method to calculate the derived point on the trunk of the target vehicle. Furthermore, the dimensions of the objects can be estimated from the received real and virtual LiDAR point clouds. Moreover, the simulation and actual measurements show good agreement for the tests performed on the proving ground. For example, the MAPE for the number of received points is 3.8%, and for the distance d it is 0.04%. The simulation and real measurement results correlate well because the sensor model is developed with high fidelity and the real-world scenarios are replicated in the simulation with high accuracy by using the GNSS/INS data without manual interpolation.
6.8. Influence of ASTM Standard KPIs on Object Detection
Deep learning and neural networks are used for object recognition, segmentation, and classification from the 3D LiDAR point clouds. However, if the 3D imaging and position estimation of the LiDAR sensor degrades, it will influence the performance of the object recognition algorithm and ADAS. To show this and to ensure the usability of simulation models validated with the ASTM standard, we trained the state-of-the-art deep learning-based PointPillars network [45] for object detection using synthetic LiDAR data. Then, we tested it with real and simulated data of the vehicle target shown in Figure 26. We use the average orientation similarity (AOS) metric [46] to find the correlation between the ground truth 3D orientation of the object and the 3D orientation of the object estimated by the object detection algorithm. The object detection confidence score (ODCS) and average orientation similarity (AOS) of the real and simulated vehicle targets are given in Table 7.
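For reference, the AOS as defined in the KITTI benchmark [46] averages the orientation similarity s(r) over eleven equally spaced recall levels:

$\mathrm{AOS} = \frac{1}{11} \sum_{r \in \{0,\,0.1,\,\ldots,\,1\}} \max_{\tilde{r} \geq r} s(\tilde{r}), \qquad s(r) = \frac{1}{|D(r)|} \sum_{i \in D(r)} \frac{1 + \cos \Delta_\theta^{(i)}}{2}\, \delta_i$

where $D(r)$ denotes the set of detections at recall level $r$, $\Delta_\theta^{(i)}$ is the difference between the estimated and ground truth heading of detection $i$, and $\delta_i$ is 1 if detection $i$ has been assigned to a ground truth box and 0 otherwise.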
Table 7.

| | Target | Distance (m) | ODCS (%) | AOS (%) |
|---|---|---|---|---|
| Cube 1 | Vehicle | 12.0 | 94.2 | 98.1 |
| Cube 1 | Vehicle | 15.5 | 92.8 | 97.7 |
| Cube 1 | Vehicle | 20.0 | 90.6 | 97.2 |
| LiDAR FMU | Vehicle | 12.0 | 95.3 | 98.8 |
| LiDAR FMU | Vehicle | 15.5 | 94.6 | 98.6 |
| LiDAR FMU | Vehicle | 20.0 | 93.3 | 98.5 |
The results show that the ODCS of the PointPillars network on simulated and real data is more than 90% up to a distance of 20 m. In addition, the AOS results are more than 97% for real and virtual LiDAR data because the object detection algorithm can correctly estimate the target dimensions and position from synthetic and real data, as shown in Figure 29.
Furthermore, the simulation and real measurements also show good agreement because the MAPE for the number of received points and the distance error is negligible. Afterward, we test the object detection algorithm with point clouds that do not meet the KPIs of the ASTM standard, as shown in Figure 30, to investigate the degradation in the performance of the object recognition algorithm. It should be noted that the vertical FoV of the LiDAR FMU is set in such a way that the number of points received from the vehicle is decreased. A distance offset is also added intentionally to produce inaccurate distances in the point cloud, in order to see the impact of the point-to-point distance estimation error on the performance of the object recognition algorithm.
Figure 31 shows that if the object recognition algorithm cannot estimate the shape or size of the object from the given point clouds, its confidence in the object’s existence degrades. In addition, if the point-to-point distance or 3D position estimation performance of the LiDAR sensor degrades, the object detection algorithm will detect the target at the wrong position. These two key performance indicators are therefore critical in evaluating the measurement performance of automotive LiDAR sensors. The initial results show that the specifications and test framework defined in the ASTM E3125-17 standard can be used to evaluate the measurement performance of virtual and real automotive LiDAR sensors, at least at an early stage of development, because the standard consists of static scenarios that can be implemented easily in real and virtual environments.
7. Conclusions
In this work, we evaluated the point-to-point distance measurement performance of the MEMS-based automotive LiDAR sensor Cube 1 from Blickfeld and its simulation model according to the ASTM E3125-17. The LiDAR simulation model is developed using the standardized interfaces OSI and FMI. It considers the accurate modeling of the scan pattern and complete signal processing steps of Blickfeld Cube 1. The ASTM E3125-17 standard defines the specifications and static test cases to evaluate the measurement performance of the LiDAR sensors.
The virtual and real automotive LiDAR sensors pass all the tests defined in the standard because the point-to-point distance estimation error is less than the MPE and the size of the objects can be estimated correctly from the received point clouds. The simulation and real measurements show good agreement. For example, for the ASTM tests, the MAPE of the number of points and of the point-to-point distance estimation d between the virtual and real targets is 1.4% and 0.04%, respectively.
In addition, we also performed measurements at the Jtown proving ground to verify the DUT’s operational performance in daylight conditions. The point-to-point distance error remains less than the MPE, and the object’s dimensions and size can be estimated correctly from the received points. The real-world scenarios and environmental conditions were replicated in the virtual environment to obtain the synthetic data. The MAPE between the actual and simulation data is 3.8% for the number of received points and 0.04% for the distance estimation d. It should be noted that it is very challenging to model real-world environmental conditions completely in the virtual environment, which is why the MAPE of the user-selected tests for the number of received points increased from 1.4% to 3.8%.
To realize the effect of the MAPE between the simulation and real measurements for the KPIs defined in the ASTM E3125-17 standard on object recognition, we trained a deep learning-based PointPillars network using simulated point cloud data and tested it with real and simulated point clouds. The simulated and real point clouds also show good agreement at the object level; for instance, the MAPE for the ODCS and AOS of the simulated and real point clouds is 2.0% and 1.0%, respectively. If the 3D imaging and position estimation performance of the LiDAR sensor drops, it influences the performance of the object detection algorithm. For instance, a sufficiently large distance error will reduce the AOS from 98.8% to 0%.
Therefore, it is concluded that virtual and real LiDAR sensors work for their intended use, at least for static use cases. Furthermore, the 3D imaging and point-to-point distance estimation capability are critical to evaluate the operational performance of automotive virtual and real LiDAR sensors. That is why it can also be concluded from the initial results that the specifications and test framework defined in ASTM E3125-17 can be used to evaluate the measurement performance of automotive virtual and real LiDAR sensors at the early stages of development.
8. Outlook
All the tests defined in ASTM E3125-17 are static, and no standardized tests are currently available to evaluate sensor performance in dynamic scenarios. That is why, in the next steps, a set of dynamic test cases will be defined from expert knowledge to assess the 3D imaging, point-to-point distance estimation, angular resolution, range resolution, and range accuracy measurement performance of real and virtual LiDAR sensors. Moreover, we will further investigate the impact of these LiDAR sensor KPIs on the performance of the object recognition algorithm.
Acknowledgments
The authors want to thank the DIVP consortium for providing technical assistance and equipment for measurements at Jtown.
Abbreviations
The following abbreviations are used in this manuscript:
ACC | Adaptive cruise control |
AOS | Average orientation similarity |
ADAS | Advanced driver-assistance system |
BSD | Blind-spot detection |
DUT | Device under test |
FX engine | Effect engine |
FMU | Functional mock-up unit |
FMI | Functional mock-up interface |
FCW | Forward collision warning |
FoV | Field of view |
GNSS | Global navigation satellite system |
INS | Inertial navigation system |
LDWS | Lane departure warning system |
LiDAR | Light detection and ranging |
LDM | Laser and detector module |
LSSF | Least squares sphere fit |
LSPF | Least squares plane fit |
MPE | Maximum permissible error |
MEMS | Micro-electro-mechanical systems |
MAPE | Mean absolute percentage error |
OSI | Open simulation interface |
OEMs | Original equipment manufacturers |
OSMP | OSI sensor model packaging |
OPA | Optical phased array |
ODCS | Object detection confidence score |
RADAR | Radio detection and ranging |
RTDT | Round-trip delay time |
RI | Reference instrument |
RMS | Root mean square |
Author Contributions
Conceptualization, A.H.; methodology, A.H.; software, A.H. and Y.C. (YongJae Cho); validation, A.H.; formal analysis, A.H. and Y.C. (YongJae Cho); data curation, A.H., Y.C. (YongJae Cho), M.H.K. and S.K.; writing—original draft preparation, A.H.; writing—review and editing, A.H., Y.C. (YongJae Cho), M.P., M.H.K., L.H., L.K., M.F., M.S., Y.C. (Yannik Cichy), S.K., T.Z., T.P., H.I., M.J. and A.W.K.; visualization, A.H.; supervision, A.W.K. and T.Z.; project administration, T.Z.; All authors have read and agreed to the published version of the manuscript.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
Funding Statement
This paper shows results from the project VIVID (German Japan Joint Virtual Validation Methodology for Intelligent Driving Systems). The authors acknowledge the financial support by the Federal Ministry of Education and Research of Germany in the framework of VIVID, grant number 16ME0170. The responsibility for the content remains entirely with the authors.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
References
- 1.Bilik I. Comparative Analysis of Radar and Lidar Technologies for Automotive Applications. IEEE Intell. Transp. Syst. Mag. 2022;15:244–269. doi: 10.1109/MITS.2022.3162886. [DOI] [Google Scholar]
- 2.Dey J., Taylor W., Pasricha S. VESPA: A framework for optimizing heterogeneous sensor placement and orientation for autonomous vehicles. IEEE Consum. Electron. Mag. 2020;10:16–26. doi: 10.1109/MCE.2020.3002489. [DOI] [Google Scholar]
- 3.Winner H., Hakuli S., Lotz F., Singer C. Handbook of Driver Assistance Systems. Springer International Publishing; Amsterdam, The Netherlands: 2014. pp. 405–430. [Google Scholar]
- 4.Kochhar N. A Digital Twin for Holistic Autonomous Vehicle Development. ATZelectron. Worldw. 2022;16:8–13. doi: 10.1007/s38314-020-0579-2. [DOI] [Google Scholar]
- 5.Synopsys, What is ADAS? 2021. [(accessed on 26 August 2021)]. Available online: https://www.synopsys.com/automotive/what-is-adas.html.
- 6.Fersch T., Buhmann A., Weigel R. The influence of rain on small aperture LiDAR sensors; Proceedings of the 2016 German Microwave Conference (GeMiC); Bochum, Germany. 14–16 March 2016; pp. 84–87. [Google Scholar]
- 7.Bellanger C. They Tried the Autonomous ZOE Cab. Here Is What They Have to Say. [(accessed on 27 July 2022)]. Available online: https://www.renaultgroup.com/en/news-on-air/news/they-tried-the-autonomous-zoe-cab-here-is-what-they-have-to-say/
- 8.Valeo Valeo’s LiDAR Technology, the Key to Conditionally Automated Driving, Part of the Mercedes-Benz DRIVE PILOT SAE-Level 3 System. [(accessed on 27 July 2022)]. Available online: https://www.valeo.com/en/valeos-lidar-technology-the-key-to-conditionally-automated-driving-part-of-the-mercedes-benz-drive-pilot-sae-level-3-system/
- 9.Chen Y. Luminar Provides Its LiDAR Technology for Audi’s Autonomous Driving Startup. [(accessed on 27 July 2022)]. Available online: https://www.ledinside.com/press/2018/12/luminar_provides_its_lidar_technology_audi_autonomous_driving_startup.
- 10.Cooper M.A., Raquet J.F., Patton R. Range Information Characterization of the Hokuyo UST-20LX LIDAR Sensor. Photonics. 2018;5:12. doi: 10.3390/photonics5020012. [DOI] [Google Scholar]
- 11.Rachakonda P., Muralikrishnan B., Shilling M., Sawyer D., Cheok G., Patton R. An overview of activities at NIST towards the proposed ASTM E57 3D imaging system point-to-point distance standard. J. CMSC. 2017;12:1–14. [PMC free article] [PubMed] [Google Scholar]
- 12.Beraldin J.A., Mackinnon D., Cheok G., Patton R. Metrological characterization of 3D imaging systems: Progress report on standards developments, EDP Sciences; Proceedings of the 17th International Congress of Metrology; Paris, France. 21–24 September 2015; p. 13003. [Google Scholar]
- 13.Muralikrishnan B., Ferrucci M., Sawyer D., Gerner G., Lee V., Blackburn C., Phillips S., Petrov P., Yakovlev Y., Astrelin A., et al. Volumetric performance evaluation of a laser scanner based on geometric error model. Precis. Eng. 2015;153:107398. doi: 10.1016/j.precisioneng.2014.11.002. [DOI] [Google Scholar]
- 14.Laconte J., Deschênes S.P., Labussière M., Pomerleau F., Milligan S. Lidar measurement bias estimation via return waveform modelling in a context of 3d mapping; Proceedings of the 2019 International Conference on Robotics and Automation (ICRA); Montreal, QC, Canada. 20–24 May 2019; pp. 8100–8106. [Google Scholar]
- 15.Lambert J., Carballo A., Cano A.M., Narksri P., Wong D., Takeuchi E., Takeda K. Performance analysis of 10 models of 3D LiDARs for automated driving. IEEE Access. 2015;8:131699–131722. doi: 10.1109/ACCESS.2020.3009680. [DOI] [Google Scholar]
- 16.Gumus K., Erkaya H. Analyzing the geometric accuracy of simple shaped reference object models created by terrestrial laser. Int. J. Phys. Sci. 2015;6:6529–6536. [Google Scholar]
- 17.Hiremagalur J., Yen K.S., Lasky T.A., Ravani B. Testing and Performance Evaluation of Fixed Terrestrial 3D Laser Scanning Systems for Highway Applications. Transp. Res. Rec. 2009;2098:29–40. doi: 10.3141/2098-04. [DOI] [Google Scholar]
- 18.Gomes T., Roriz R., Cunha L., Ganal A., Soares N., Araújo T., Monteiro J. Evaluation and Testing Platform for Automotive LiDAR Sensors. Appl. Sci. 2022;12:13003. doi: 10.3390/app122413003. [DOI] [Google Scholar]
- 19.Nahler C., Steger C., Druml N. Quantitative and Qualitative Evaluation Methods of Automotive Time of Flight Based Sensors; Proceedings of the 23rd Euromicro Conference on Digital System Design (DSD); Kranj, Slovenia. 26–28 August 2020; pp. 651–659. [Google Scholar]
- 20.HESAI, Hesai Acts as Group Leader for ISO Automotive Lidar Working Group, 2022. [(accessed on 15 March 2022)]. Available online: https://www.hesaitech.com/en/media/98.
- 21.DIN, DIN SPEC 91471 Assessment Methodology for Automotive LiDAR Sensors. [(accessed on 27 July 2022)]. Available online: https://www.din.de/en/wdc-beuth:din21:352864796.
- 22.ASTM International . ASTM E3125-17 Standard Test Method for Evaluating the Point-to-Point Distance Measurement Performance of Spherical Coordinate 3D Imaging Systems in the Medium Range. ASTM International; West Conshohocken, PA, USA: 2017. [Google Scholar]
- 23.Haider A., Pigniczki M., Köhler M.H., Fink M., Schardt M., Cichy Y., Zeh T., Haas L., Poguntke T., Jakobi M., et al. Development of High-Fidelity Automotive LiDAR Sensor Model with Standardized Interfaces. Sensors. 2022;22:7556. doi: 10.3390/s22197556. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Wang L., Muralikrishnan B., Lee V., Rachakonda P., Sawyer D., Gleason J. A first realization of ASTM E3125-17 test procedures for laser scanner performance evaluation. Measurement. 2020;153:107398. doi: 10.1016/j.measurement.2019.107398. [DOI] [Google Scholar]
- 25.ASAM e.V. OSI Sensor Model Packaging Specification. [(accessed on 7 June 2021)]. Available online: https://opensimulationinterface.github.io/osi-documentation/osi-sensor-model-packaging/doc/specification.html.
- 26.Wang D., Watkins C., Xie H. MEMS Mirrors for LiDAR: A Review. Micromachines. 2020;11:456. doi: 10.3390/mi11050456. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Thakur R. Scanning LIDAR in Advanced Driver Assistance Systems and Beyond: Building a road map for next-generation LIDAR technology. IEEE Consum. Electron. Mag. 2016;5:48–54. doi: 10.1109/MCE.2016.2556878. [DOI] [Google Scholar]
- 28.Blickfeld “Cube 1 Outdoor v1.1”, Datasheet, 2022. [(accessed on 10 January 2023)]. Available online: https://www.blickfeld.com/wp-content/uploads/2022/10/blickfeld_Datasheet_Cube1-Outdoor_v1.1.pdf.
- 29.Holmstrom S.T.S., Baran U., Urey H. MEMS Laser Scanners: A Review. J. Microelectromech. Syst. 2014;23:259–275. doi: 10.1109/JMEMS.2013.2295470. [DOI] [Google Scholar]
- 30.Blickfeld Gmbh Crowd Analytics Privacy-Sensitive People Counting and Crowd Analytics. [(accessed on 5 March 2023)]. Available online: https://www.blickfeld.com/applications/crowd-analytics/
- 31.Petit F. Myths about LiDAR Sensor Debunked. [(accessed on 5 July 2022)]. Available online: https://www.blickfeld.com/de/blog/mit-den-lidar-mythen-aufgeraeumt-teil-1/
- 32.Müller M. The Blickfeld Scan Pattern: Eye-Shaped and Configurable September 2, 2020. [(accessed on 23 March 2022)]. Available online: Https://www.blickfeld.com/blog/scan-pattern/
- 33.Blickfeld Scan Pattern. [(accessed on 7 July 2022)]. Available online: https://docs.blickfeld.com/cube/latest/scan_pattern.html.
- 34.Müller M. LiDAR Explained – Understanding LiDAR Specifications and Performance. [(accessed on 8 March 2023)]. Available online: https://www.blickfeld.com/blog/understanding-lidar-specifications/#Range-Precision-and-Accuracy.
- 35.Hexagon Metrology, Romer Absolute Arm, 2015. [(accessed on 10 January 2023)]. Available online: https://www.atecorp.com/atecorp/media/pdfs/data-sheets/hexagon-romer-absolute-arm-datasheet-1.pdf?ext=.pdf.
- 36.JARI, Automated Driving Test Center (Jtown) 2022. [(accessed on 10 January 2023)]. Available online: https://www.jari.or.jp/en/contract_testing_equipment/facilities_equipment/jtown/
- 37.Jekel C. Master’s Thesis. Stellenbosch University; Stellenbosch, South Africa: 2016. Obtaining Non-Linear Orthotropic Material Models for PVC-Coated Polyester via Inverse Bubble Inflation. [Google Scholar]
- 38.Harris C.R., Millman K.J., van der Walt S.J., Gommers R., Virtanen P., Cournapeau D., Wieser E., Taylor J., Berg S., Smith N.J., et al. Array programming with NumPy. Nature. 2020;585:357–362. doi: 10.1038/s41586-020-2649-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Lang S., Murrow G. Geometry. Springer; New York, NY, USA: 1988. The Distance Formula; pp. 110–122. [Google Scholar]
- 40.Leica Geosystem, Leica DISTO S910, 2022. [(accessed on 26 March 2022)]. Available online: https://www.leicadisto.co.uk/shop/leica-disto-s910/
- 41.ASAM e.V. Open Simulation Interface (OSI) 2022. [(accessed on 30 June 2022)]. Available online: https://opensimulationinterface.github.io/open-simulation-interface/index.html.
- 42.Shakarji C. Least-squares fitting algorithms of the NIST algorithm testing system. J. Res. Natl. Inst. Stand. Technol. 1998;103:633–641. doi: 10.6028/jres.103.043. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Community B.O. Blender—A 3D Modelling and Rendering Package, Stichting Blender Foundation, Amsterdam. 2018. [(accessed on 10 November 2022)]. Available online: http://www.blender.org.
- 44.Swamidass P.M., editor. Encyclopedia of Production and Manufacturing Management. Springer; Boston, MA, USA: 2000. Mean Absolute Percentage Error (MAPE) p. 462. [Google Scholar]
- 45.Lang A.H., Vora S., Caesar H., Zhou L., Yang J., Beijbom O. Pointpillars: Fast encoders for object detection from point clouds; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; Honolulu, HI, USA. 21–26 July 2017; pp. 12697–12705. [Google Scholar]
- 46.Geiger A., Lenz P., Urtasun R. Are we ready for autonomous driving? The KITTI vision benchmark suite; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; Providence, RI, USA. 16–21 June 2012; pp. 3354–3361. [Google Scholar]