HardwareX
. 2023 May 6;14:e00426. doi: 10.1016/j.ohx.2023.e00426

ROMR: A ROS-based open-source mobile robot

Linus Nwankwo 1,, Clemens Fritze 1, Konrad Bartsch 1, Elmar Rueckert 1
PMCID: PMC10197097  PMID: 37216020

Graphical abstract


Keywords: Mobile robot, ROS, open-source robot, differential-drive robot, autonomous robot

Abstract

Currently, commercially available intelligent transport robots that are capable of carrying up to 90 kg of load can cost $5,000 or even more. This makes real-world experimentation prohibitively expensive and limits the applicability of such systems to everyday home or industrial tasks. Aside from their high cost, the majority of commercially available platforms are either closed-source, platform-specific or use difficult-to-customize hardware and firmware. In this work, we present a low-cost, open-source and modular alternative, referred to herein as “ROS-based Open-source Mobile Robot (ROMR)”. ROMR utilizes off-the-shelf (OTS) components, additive manufacturing technologies, aluminium profiles, and a consumer hoverboard with high-torque brushless direct current (BLDC) motors. ROMR is fully compatible with the robot operating system (ROS), has a maximum payload of 90 kg, and costs less than $1500. Furthermore, ROMR offers a simple yet robust framework for contextualizing simultaneous localization and mapping (SLAM) algorithms, an essential prerequisite for autonomous robot navigation. The robustness and performance of the ROMR were validated through real-world and simulation experiments. All the design, construction and software files are freely available online under the GNU GPL v3 license at https://doi.org/10.17605/OSF.IO/K83X7. A descriptive video of ROMR can be found at https://osf.io/ku8ag.


Specification table

Hardware name A ROS-based open-source mobile robot (ROMR)
Subject area
  • Robotics

  • Sensor fusion

  • Simultaneous localisation and mapping (SLAM)

  • Navigation

  • Teleoperation

  • Research and development in robotics

  • General

Hardware type
  • Mechatronic

  • Robotic

Open source license GNU GPL v3
Cost of hardware < $1500
Source file repository https://doi.org/10.17605/OSF.IO/K83X7

1. Hardware in context

Intelligent transport robots (ITRs) have become an integral part of our daily activities in recent years [1], [2], [3]. Their application to day-to-day activities, especially in industrial logistics [4], warehousing [5], household tasks [6], [7], etc., not only provides a cleaner and safer work environment but also helps to reduce high production costs. These robots offer the potential for significant improvements in industrial safety [8], productivity [9] and general operational efficiency [10]. However, several challenges remain, such as reduced capacity [10], [11], affordability [12], and the difficulty of modifying the inbuilt hardware and firmware [13].

Although the scientific community has made significant efforts in recent years to develop a standardised low-cost mobile platform, no open-source system fulfils the high industrial requirements. Most available open-source, low-cost platforms are still limited in their functions and features, i.e., they are not commercially available, do not match the payloads required for logistics tasks, or cannot be adapted. For example, while [14], [15], [16] are inexpensive, they cannot be used for day-to-day tasks that require transporting loads of 5 kg or more.

On the other hand, many commercially available industrial platforms exist that feature high payloads; see Table 1 for an overview. Unfortunately, they are expensive, closed-source, and platform-specific, with inbuilt hardware and firmware that may be difficult to modify [16]. This restricts many users’ ability to explore multiple customisation or reconfiguration options to accelerate the development of intelligent systems.

Table 1.

Comparison of existing mobile platforms to our ROMR development.

Robot Name ROS Payload (kg) Custom Cost(k$) OpenS
RMP Lite 220 50 2.99 ×
Ackerman Pro Smart 22 3.99 ×
Nvidia Carter 50 10.00
Panther UGV 80 15.90 ×
Tiago base 100 11.50 ×
Clearpath TurtleBot 4 9 1.90
Summit XL 65 11.50 ×
MIR100 100 24.00 ×
AgileX Scout 2.0 50 12.96 ×
Jackal J100 20 18.21 ×
4WD eXplorer 90 15.70 ×
ROSbot 2.0 10 2.34 ×
ROMR 90 1.50

Consequently, there is a crucial need to have low-cost robots with comparable features that can easily be scaled to adapt to any useful purpose. To this end, we propose ROMR, a modular, open-source and low-cost alternative for general-purpose applications including research, navigation [17], and logistics.

ROMR is fully compatible with ROS, has a maximum payload of 90 kg and costs less than $1500. It features several lidar sensor technologies for potential application for perception [18], simultaneous localisation and mapping [19], [20], deep learning tasks [21], and many more. Fig. 1, Fig. 8 present the pictorial view and the cyber-physical anatomy of ROMR respectively.

Fig. 1.

Fig. 1

ROMR is built from consumer hoverboard wheels with high-torque brushless direct current (BLDC) motors. It utilises an Arduino Mega Rev3, an Nvidia Jetson Nano, and the components described in Table 4, Table 5. (a) Front view (b) side view (c) back view.

Fig. 8.

Fig. 8

The ROMR hardware architecture is based on an ODrive board (light green) to actuate the motors, an Nvidia Jetson Nano (light blue) as a computing interface for high-level tasks, and an Arduino Mega Rev3 (orange) as a low-level computing interface.

1.1. Related hardware platforms

In Table 1, we present a comparative evaluation of ROMR against similar hardware platforms developed in recent years for research, navigation and logistics applications. Our comparison focuses on key features that define the openness, robustness and versatility of a robot design, e.g., ease of reconfiguration or modification, cost performance, load-carrying capacity, and full compatibility with ROS [22].

In Table 1, the ROS feature indicates whether the robot is fully compatible with ROS. Custom indicates whether the platform allows easy modification of its design and integration of additional components. OpenS indicates whether the hardware (electronic circuits, design files, etc.) and software (source code, ROS packages, etc.) are fully open-source and maintained by the open-source community, such that external hobbyists can replicate the design without contacting the developer. Finally, the cost feature reflects the affordability of the system. As shown in Table 1, the majority of the robots are not open-source and are expensive. This limits wide application, e.g., in research, navigation and logistics. ROMR has been developed as a low-cost, open-source alternative; the approximate cost to rebuild it currently stands at less than $1,500.00. Recently, ROMR has been used for both B.Sc. and M.Sc. projects in our laboratory.

2. The ROMR description

ROMR is compactly and robustly designed to ensure stability, ease of integration of additional components, and low cost of reproducing the system. In this section, we describe the robot’s hardware and software platforms. Afterwards, we describe the technical specifications and tools, as well as the detailed architecture of the control unit.

2.1. Hardware description

We leveraged off-the-shelf (OTS) electronics that are commercially available online, additive manufacturing technologies (3D printing), and aluminium profiles for the robot’s structural design. The main reason for using aluminium profiles is to achieve a lightweight structure that can hold the hardware and associated electronics of the robot without increasing its overall weight. At the same time, the profiles could resist load stress and damage during everyday use. The profile bars with slots were also used to enclose all the electronics and power subsystems at the base of the robot for proper weight distribution. This increases the flexibility to connect any additional hardware component to it.

ROMR is equipped with an Arduino Mega Rev 3, an Nvidia Jetson Nano board, an Odrive 56 V V3.6 brushless direct current (DC) motor controller, and two 350 W hoverboard brushless motors with five inbuilt hall sensors. The Arduino board is responsible for low-level tasks, such as gesture-based control of the robot using an inertial measurement unit (IMU) sensor and teleoperation from remote-controlled (RC) devices. In contrast, the Jetson Nano board handles high-level processing tasks such as deep learning, SLAM, and ROS navigation tasks with the RGB-D cameras and LiDARs, etc. The two boards communicate with each other through the serial UART interface.

Furthermore, ROMR is powered by rechargeable lithium-ion batteries (36 V, 4400 mAh), which are affordable, inexpensive to maintain, and eco-friendly (i.e., they do not contain heavy metals such as lead or cadmium, which are harmful to the environment and human health). Additionally, the robot is endowed with an Intel Realsense D435i RGB-D camera for visual perception and depth sensing, and an Intel Realsense T265 for localisation and tracking. ROMR is also equipped with a 9-axis MPU-9250 IMU sensor for tracking and localisation. The IMU sensor is also used for gesture-based teleoperation, which allows a non-expert to intuitively control the robot using hand gestures (see SubSection 6.2.3 for more details). For 2D mapping, we used an RPlidar A2 M8 with a 360-degree field of view (FOV). This lidar has a maximum range of 16 m and operates at a frequency of 10 Hz. The lidar sensor allowed us to create a 2D occupancy-grid map of the environment, which was then used to support ROMR navigation, localisation, and obstacle detection.
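To make the mapping step concrete, the sketch below shows how individual 2D lidar returns can be binned into occupancy-grid cells. This is an illustrative example only, not code from the ROMR repository; the 0.05 m cell resolution and the function name `scan_to_cells` are our assumptions.

```python
import math

def scan_to_cells(ranges, angle_min, angle_increment,
                  resolution=0.05, origin=(0.0, 0.0)):
    """Convert a 2D lidar scan (polar readings) into occupied grid-cell indices.

    ranges: list of range readings in metres (RPlidar A2: max 16 m)
    resolution: grid cell size in metres (0.05 m is a common choice, assumed here)
    """
    cells = set()
    for i, r in enumerate(ranges):
        if not (0.0 < r <= 16.0):      # discard invalid or out-of-range beams
            continue
        theta = angle_min + i * angle_increment
        x = origin[0] + r * math.cos(theta)   # polar -> cartesian
        y = origin[1] + r * math.sin(theta)
        cells.add((math.floor(x / resolution), math.floor(y / resolution)))
    return cells

# A single 1.0 m beam straight ahead lands 20 cells along +x at 0.05 m resolution.
print(scan_to_cells([1.0], angle_min=0.0, angle_increment=0.0))   # → {(20, 0)}
```

In a full SLAM pipeline the cells along each beam (up to the hit) would additionally be marked free, e.g. via Bresenham ray tracing; the sketch only marks the endpoints.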

We supported ROMR with a small caster wheel in addition to its drive wheels to increase stability and manoeuvrability and to ensure proper weight distribution. Although it is possible to use a bigger caster wheel or two caster wheels, we chose a single smaller one to make it easier to navigate tight spaces and obstacles and to provide better stability and balance, especially when the robot has to change direction quickly or make sudden turns. Furthermore, with a smaller caster wheel, the robot’s weight is distributed more evenly, which helps to prevent tipping or loss of balance. Note that it is recommended to add a piece of rubber between the caster wheel and the ROMR base frame to compensate for hyperstaticity.

2.2. Software description

The robot’s main software is based primarily on the ROS framework, which runs on both Ubuntu 18.04 (ROS Melodic) and Ubuntu 20.04 (ROS Noetic). The ROS framework provides a set of tools, libraries, and conventions for building the robot system.

The software subsystems include the Arduino sketches, ODrive calibration programs, the ROS workspace containing the ROMR universal robot description format (URDF) files for a Gazebo simulation, the Gazebo plugins files, the ROMR meshes, the launch files, the joint and rviz configuration files, and the RPlidar packages. The files are listed in Table 6 and are published using the open-source license GNU GPL V3, which allows the community to reproduce, modify, redistribute, and republish them.

Table 6.

Software files.

Des. Name File Name Description Type Open source license File location
S1 sketches.zip Folder containing the Arduino sketches Arduino sketches GNU GPL v3 https://osf.io/r5cgp
S2 romr_robot.zip Folder containing the ROMR ROS files ROS files GNU GPL v3 https://osf.io/e4syc

2.3. Control unit description

The control unit includes multiple options for controlling the robot. In addition to the hardware described in SubSection 2.1, it includes RC receivers and transmitters, an Android device running the ROS-Mobile app, an IMU sensor, an nRF24L01+ module, and an additional Arduino board. The RC receivers and transmitters provide manual control of the robot. The Android device running the ROS-Mobile app serves as an alternative control interface. The nRF24L01+ module provides wireless communication between the control unit and the robot. The additional Arduino board interfaces with the nRF24L01+ module and the IMU sensor to handle wireless communication in the case of gesture-based teleoperation. SubSections 6.2.1, 6.2.2, 6.2.3, and Fig. 11, Fig. 12, Fig. 14 provide insights into the architecture of the control unit.
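As a rough illustration of how such a gesture interface can work, the sketch below maps IMU tilt angles to drive commands. The dead zone, maximum tilt, scale factors, and function name are illustrative assumptions, not values taken from the ROMR firmware.

```python
def gesture_to_cmd(pitch_deg, roll_deg, dead_zone=10.0,
                   max_lin=1.0, max_ang=1.0, max_tilt=45.0):
    """Map hand tilt (e.g. from the MPU-9250) to (linear, angular) velocity.

    Tilting forward/backward drives, tilting sideways steers; tilts inside
    the dead zone are ignored so the robot stays still for a level hand.
    """
    def scale(angle):
        if abs(angle) < dead_zone:
            return 0.0
        sign = 1.0 if angle > 0 else -1.0
        mag = min(abs(angle), max_tilt)
        return sign * (mag - dead_zone) / (max_tilt - dead_zone)

    return scale(pitch_deg) * max_lin, scale(roll_deg) * max_ang

print(gesture_to_cmd(45.0, 0.0))   # full forward tilt → (1.0, 0.0)
print(gesture_to_cmd(5.0, -5.0))   # small tilts inside the dead zone → (0.0, 0.0)
```

On the robot side, the resulting pair would be transmitted over the nRF24L01+ link and applied as throttle and steering commands.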

Fig. 11.

Fig. 11

Setting up the RC control on the Turnigy 9x. The red line indicates PWM activation, the blue line indicates throttle control and the green line indicates steering control.

Fig. 12.

Fig. 12

The ROMR real-time control and monitoring with ROS-Mobile device (a) System communication structure (b) Control and monitoring from ROS-Mobile [30] or Android-based devices.

Fig. 14.

Fig. 14

Block diagram illustrating the architecture and flow of information during the gesture control approach.

2.4. Technical specifications and features

The technical specifications of ROMR are summarised in Table 2. Furthermore, in Table 3, we present the tools and key features of ROMR alongside robots with comparable specifications. When envisioning the design, one of our core goals was to develop a low-cost, scalable platform that robotic developers and the open-source community could easily adapt, to foster research in mobile navigation. We therefore tailored our design considerations such that ROMR should be:

  • Modular – To offer the users the opportunity to easily integrate additional parts or units and reconfigure them to suit their needs.

  • Portable and simple – To ensure that minimal and off-the-shelf hardware components could be used for its construction and replication. This would allow users not to worry about purchasing costly components and instead focus on the system’s functional design.

  • Low-cost and open-source – To ensure affordability and commercialisation of the system so that users can leverage the ROMR framework in any form to develop novel, non-trivial robotic applications.

  • Versatile and suitable – To ensure that it takes minimal time to be re-programmed for any useful purpose, whether for navigation, logistics, etc., as well as to adapt to new processes and changes in the environment.

  • Unique – To enable hobbyists and users to learn new tools, techniques and methods useful to accelerate the development of intelligent systems.

Table 2.

ROMR technical specifications. L Length, W Width, H Height, ϕ Wheel diameter.

Parameters Technical specifications
Robot dimensions L × W × H = 0.46 m × 0.34 m × 0.43 m
Wheel dimensions Drive wheel (ϕ = 0.165 m); caster wheel (ϕ = 0.075 m)
Inter-wheel distance 0.29 m
Robot weight 17.1 kg
Max. payload 90 kg
Max. speed Up to 3.33 m/s
Max. stable speed < 2.5 m/s
Battery capacity 36 V, 4400 mAh
Motor type BLDC with 15 pole pairs (350 W × 2)
Ground clearance 0.065 m
Operation environment Indoor and outdoor
Run time (full charge) Approximately 8 h with the robot weight (17.1 kg) only
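The drive-wheel diameter and inter-wheel distance in Table 2 are the quantities that enter the standard differential-drive kinematics. The following sketch applies those textbook equations with the Table 2 values; it is an illustration, not code from the ROMR repository.

```python
# Differential-drive forward kinematics with the Table 2 dimensions.
WHEEL_RADIUS = 0.165 / 2   # half the 0.165 m drive-wheel diameter (Table 2)
WHEEL_BASE = 0.29          # inter-wheel distance in metres (Table 2)

def body_velocity(omega_left, omega_right):
    """Wheel angular speeds (rad/s) -> robot linear (m/s) and angular (rad/s)."""
    v = WHEEL_RADIUS * (omega_right + omega_left) / 2.0
    w = WHEEL_RADIUS * (omega_right - omega_left) / WHEEL_BASE
    return v, w

# Equal wheel speeds give straight-line motion with no rotation.
v, w = body_velocity(10.0, 10.0)
print(round(v, 3), w)   # 0.825 0.0
```

At the maximum speed of 3.33 m/s from Table 2, each wheel turns at about 3.33 / 0.0825 ≈ 40 rad/s.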

Table 3.

Overview of the ROMR hardware and software tools.

Features Tools
Actuation ODrive 56 V V3.6 brushless DC motor controller, Nvidia Jetson Nano, and Arduino Mega Rev 3
Sensing & feedback RPlidar A2, IMU, Depth cameras (Intel Realsense D435i & T265)
Operating system Ubuntu 20.04 (ROS Noetic) or Ubuntu 18.04 (ROS Melodic)
Communication ROS architecture (ROS C/C++ & ROS Python libraries), WiFi 802.11n, USB & Ethernet (for debugging)
Navigation & drive interfaces ROS navigation stack, position & joint trajectory controller, joystick, rqt-plugin, ROS-Mobile (Android devices), hand gesture, web-based GUI
SLAM Hector-SLAM, Cartographer, Gmapping, RTAB-Map, etc
Simulation & visualisation Gazebo, Rviz, MATLAB

3. Design files summary

The ROMR design files are categorised into three units: (a) the mechanical unit, (b) the software unit, and (c) the power, sensors and electronics unit. Each unit is briefly described in Table 4, Table 5, Table 6. Table 4 describes the additively manufactured (3D-printed) parts and the 3D CAD models of the aluminium profiles and their accessories. The CAD files were used to generate the universal robot description format (URDF) [23] description of ROMR in order to simulate it with the ROS framework [22]. These parts were designed using the Solid Edge CAD tool. Table 5 lists all off-the-shelf (OTS) electronic components. Table 6 contains the software files. All design and construction files can be downloaded from our repository: https://doi.org/10.17605/OSF.IO/K83X7.

Table 4.

Summary of the ROMR mechanical structure unit.

Des. Description File type Open S. license File location
P1 Alum. 40 × 40 mm slot 8 .stp GNU GPL v3 https://osf.io/qpb2v
P2 Alum. 40 × 80 mm slot 8 .stp GNU GPL v3 https://osf.io/zmqe6
P3 Corner bracket I-type .stp GNU GPL v3 https://osf.io/63jwa
P4 Mounting bracket I-type .stp GNU GPL v3 https://osf.io/ncbu9
P5 Bottom cover .stp GNU GPL v3 https://osf.io/ae9um
P6 Top cover .stp GNU GPL v3 https://osf.io/9v5j4
P7 Swivel caster .stp GNU GPL v3 https://osf.io/g7pxb
P8 Screw/bolt .stp GNU GPL v3 https://osf.io/jcefn
P9 Corner bracket cover cap .stp GNU GPL v3 https://osf.io/sfgb9
P10 ROMR full assembly .asm GNU GPL v3 https://osf.io/jybzf
P11 0.42 × 0.32 × 0.15 m box .stp GNU GPL v3 https://osf.io/5v3ra
P12 Lock nut .stp GNU GPL v3 https://osf.io/dk3ha
P13 Front hole plate .stp GNU GPL v3 https://osf.io/yhuw9
P14 RPLidar base holder .stl (3D print) GNU GPL v3 https://osf.io/envdh
P15 Drive wheel .stp GNU GPL v3 https://osf.io/vwztk

Table 5.

OTS electronics, sensors and control devices.

Des. Description File type Qty File location
P16 Nvidia Jetson Nano B01 64 GB png 1 https://osf.io/72xtm
P17 Arduino Mega Rev 3 png 1 https://osf.io/d3qmj
P18 ODrive V3.6 56 V png 1 https://osf.io/jghnv
P19 IMU (Invensense MPU-9250 9DOF) png 1 https://osf.io/k2h35
P20 Intel Realsense D435i camera png 1 https://osf.io/xu368
P21 Intel Realsense T265 camera png 1 https://osf.io/6cj5r
P22 nRF24L01+ PA+LNA module png 1 https://osf.io/43kd6
P23 Turnigy 2.4 GHz 9X 8-Channel V2 transmitter & receiver png 1 https://osf.io/3wu2x
P24 125 mm & 225 mm M2M, M2F, F2F GPIO wires png 24 https://osf.io/kv982
P25 RPLidar A2 M8 png 1 https://osf.io/pej62
P26 36 V Lithium Ion battery 4400mAh png 1 https://osf.io/umskh
P27 Power bank 2400 mA png 1 https://osf.io/z98gv

3.1. Mechanical unit

The mechanical unit includes the additively manufactured parts, the 3D CAD models of the aluminium profiles, and their accessories, as summarised in Table 4. The labels Pn, with n = 1, 2, ..., refer to the individual parts.

  • P1-P4,P7-P9 and P12 are aluminium profiles and accessories used for the robot’s chassis construction.

  • P5 and P6 are used to cover the base of the robot, where the electronic components are placed.

  • P10 is the full CAD assembly of the ROMR.

  • P11 and P13 are used for material carriage and for mounting the RGB-D cameras respectively.

  • P14 is 3D-printed to attach the RPlidar sensor.

  • P15 is the ROMR drive wheel from a consumer hoverboard scooter.

3.2. Power, sensors, and electronics units

Table 5 shows a summary of the used electronics, sensors and control devices, as well as the power sources. P16-P27 are off-the-shelf (OTS) components from different vendors. All vendors are listed in Table 7.

  • P16 is the main brain of the robot. It handles the high-level control task required to run all the sensing, perception, planning and control modules.

  • P17 is one of the most successful open-source microcontroller platforms, with relatively easy-to-use free libraries compared to other open-source boards. It is used for low-level control, sending control commands to P18, which in turn drives and steers P15. Furthermore, it handles communication between P16 and P18, as well as with any future compatible devices that support ROS serial communication, e.g., Raspberry Pi, STM32, etc.

  • P18 is a high-performance, open-source brushless direct current (BLDC) motor driver from ODrive Robotics [24]. It handles all computations required to drive the two inbuilt hoverboard BLDC motors, each with five hall-effect sensors used for motor position feedback. Note that if P18 reaches end-of-life (EOL), the ODrive Pro (https://odriverobotics.com/shop/odrive-pro) can be used as a replacement motor driver, since it provides even more advanced features than the board used in this work.

  • P19 is a 9-axis inertial measurement unit (IMU) sensor specifically used for tracking, localization and gesture-based control tasks.

  • P20 and P21 are 3D vision cameras for visual perception, depth sensing, tracking, localization and mapping tasks. Both cameras generate 3D image data of the task environment, process the data and then publish it to the appropriate topic in the ROS network.

  • P22 and P23 are communication devices for wireless control of the ROMR.

  • P24 are general-purpose input-output (GPIO) connection wires.

  • P25 is a 2D lidar with a 360-degree field of view (FOV), a maximum range distance of 16 m, operating at a frequency of 10 Hz. It is used for generating a 2D occupancy grid map of the robot’s operational environment and for collision detection and avoidance.

  • P26 and P27 are the system’s power sources for the motors, the sensors, and the control boards.

  • S1 and S2 are the folders containing the Arduino control programs and the ROMR ROS files respectively.

Table 7.

BOM for building ROMR, and the respective links of where they were purchased. The BOM reflects only the component prices and does not include labour costs (purchasing, manufacturing, marketing, …).

Designator Qty Unit cost (€) Total cost (€) Source of material Material type
Profile 40x40L I-type slot 8 (1.98 m long) 1 26.49 26.49 www.motedis.at Other
Profile 40x80L I-type slot 8 (1.1 m long) 1 25.64 25.64 www.motedis.at Other
Corner bracket I-type 20 0.51 10.20 www.motedis.at Other
Mounting bracket I-type 40 0.18 7.20 www.motedis.at Other
Bottom & Top cover (50 x 50 cm) 1 6.99 6.99 https://www.obi.at/ Other
Screw/bolt 100 0.13 12.64 www.motedis.at Other
Corner bracket cover cap 20 0.18 3.60 www.motedis.at Other
Swivel caster 1 4.51 4.51 www.motedis.at Other
Hoverboard brushless DC motor wheels 2 24.50 49.00 www.voltes.nl Other
Nvidia Jetson Nano B01 64 GB 1 260.22 260.22 www.amazon.de Other
Arduino Mega Rev 3 1 32.78 32.78 www.amazon.de Other
ODrive V3.6 56 V 1 249.00 249.00 www.odriverobotics.com Other
IMU (MPU-9250) 1 15.92 15.92 www.distrelec.at Other
nRF24L01+ PA+LNA module 1 9.99 9.99 www.amazon.de Other
Turnigy 2.4 GHz 9X 8-Channel V2 transmitter & receiver 1 69.99 69.99 www.hobbyking.com Other
125 mm & 225 mm M2M, M2F, F2F GPIO wires 1 3.99 3.99 www.amazon.de Other
36 V Lithium-Ion battery 4400mAh 1 33.63 33.63 https://de.aliexpress.com Other
22nF capacitors 6 0.05 0.30 https://www.conrad.at Other

4. Bill of materials (BOM) summary

Table 7 provides a summary of the bill of materials; the lidar, the depth cameras, the Turnigy 2.4 GHz 9X 8-Channel V2 transmitter & receiver, the 22 nF capacitors, and the MPU-9250 sensor were sourced in the laboratory. The BOM reflects only the component prices and does not include labour costs (purchasing, manufacturing, marketing, …).

5. Build instructions

Once the materials described in Table 4 and Table 5 are available, the subsequent task is to assemble them accordingly. To do that, the sequence of steps presented in this section should be followed. Some basic tools such as pliers, screwdrivers, a wire stripper, a 3D printer, a saw, a soldering iron and others are required for building the hardware.

5.1. Hardware build instruction

The ROMR chassis is built using the parts described in Table 4. In assembling the chassis, we prioritised compactness to obtain a light and reliable system without compromising stability. The mechanical construction proceeds in several steps as follows:

  • 1.

    Top base assembly: For this step, the parts required are P1, P3, and P8 as referenced in Table 4. First, cut two pieces each of 0.38 m, 0.26 m and 0.24 m lengths of P1. At each end of each piece, use a tap to cut an M8 thread (hole diameter 0.008 m). A description of how to use a tap can be found at https://www.wikihow.com/Use-a-Tap. Assemble the parts as illustrated in Fig. 2, following the direction of the arrows. The first block shows the components required, the middle block shows the exploded view with the interconnection of the components, and the last block shows the outcome of the assembly.

  • 2.

    Bottom and top chassis assembly: Parts required in this step are P2,P3,P4,P8,P9 and the result of step 1. Assemble the parts as illustrated in Fig. 3.

  • 3.

    Mounting of the caster and the drive wheels: Parts required are P4,P7,P8,P15 and the base from step 2. Note that P8 (M8) and P8 (M6) are required to mount P15 and P7 firmly to the base respectively. Assemble the parts as illustrated in Fig. 4.

  • 4.

    Mounting the bottom base plate: The base plate is rectangular green plastic meant to hold and protect all the electronics subsystems. Parts required are P5,P8 (M3) and the resulting base from step 3. These parts are assembled as shown in Fig. 5.

  • 5.

    Mounting the internal electronics, the power subsystems to the bottom base plate and the RGB-D cameras: Parts required are P13,P17,P18,P19,P20,P21,P22,P23,P26, and the result from step 4. A couple of M3 screws (P8) are required for mounting the electronic components at the respective position. Fig. 6 illustrates the procedure.

  • 6.

    Mounting the top cover, external electronics, and final coupling: Parts required are P1,P3,P4,P6,P8 (M3),P8 (M8),P9,P11,P16,P25, and the resulting hardware from step 5. Fig. 7 illustrates the procedure.

Fig. 2.

Fig. 2

Top base chassis assembly.

Fig. 3.

Fig. 3

Assembling the top and bottom chassis.

Fig. 4.

Fig. 4

Mounting of the caster and the drive wheels to the base frame.

Fig. 5.

Fig. 5

Mounting of the bottom plate to the base frame for attaching the electronics subsystems.

Fig. 6.

Fig. 6

Mounting of the internal electronics and power subsystems to the bottom base plate.

Fig. 7.

Fig. 7

Mounting of the external electronics and the final coupling.

5.2. General connection and wiring instruction

The electronics, vision and sensor subsystem of ROMR are composed of several components interconnected by energy links and bidirectional or unidirectional information links as shown in Fig. 8. Each link either sends or receives information from the interconnected components. The instruction below shows how the system wiring was done.

  • 1.

    ODrive BLDC controller (P18): The ODrive controller has to be wired to the motors as illustrated in Fig. 9. Each of the three phases of each motor has to be connected to the motor outputs M0 and M1. The order in which the motor phases are connected is not important; the ODrive controller determines it during a calibration phase. However, after calibration the order must not be changed. If it is changed, re-calibrate the motors.

    Furthermore, hoverboard motors are equipped with five hall sensors, coloured red, black, blue, green and yellow, for position feedback. Unfortunately, the ODrive controller has no noise-filtering capacitors, and consequently the hall-sensor readings are susceptible to noise. To get consistent and clean readings from the hall-effect sensors, noise-filtering capacitors must be connected to the corresponding ODrive J4 pinouts as described in Table 8.

    The required value of the filtering capacitor is approximately 22 nF. However, if you do not have exactly 22 nF, connecting two 47 nF capacitors in series (c1 = c2 = 47 nF) to obtain approximately 23.5 nF will also work. A 50 W power resistor is also required if the robot runs on a battery; it prevents the ODrive from shutting down unexpectedly as a result of energy regenerated into the battery. The resistor is usually supplied along with the ODrive controller.

  • 2.

    Arduino to ODrive interconnection: The ODrive communicates with the Arduino through a serial port, a universal asynchronous receiver/transmitter (UART). The UART pinouts are GPIO 1 (TX) and GPIO 2 (RX), which should be connected to the Arduino’s RX (pin 18) and TX (pin 17) respectively. The grounds (GND) of the Arduino and the ODrive controller boards must also be connected.

  • 3.

    Power distribution: It is recommended to use two separate power sources for the ODrive controller and the Jetson Nano to avoid a ground loop [25]. The 36 V, 4400 mAh lithium-ion battery is wired directly to the ODrive motor controller, and an additional power-bank battery (P27) powers the Jetson, which requires only 5 V at 2500 mA.

  • 4.

    The nRF24L01+ module, MPU-9250 and Arduino connections: As shown in Fig. 14, the control unit consists of P17, P19, and P22. These parts are required for the wireless transmission of data. The MPU-9250 and the Arduino are connected with four cables: ground (GND), power supply (VCC) and two cables for I2C communication (SDA, SCL). The SDA pin of the MPU-9250 is connected to the SDA (pin 20) of the Arduino, and the SCL pin to the SCL (pin 21) of the Arduino. A power supply between 2.4 V and 3.6 V is needed; consequently, the 3.3 V power supply pin of the Arduino is recommended [26]. Communication between the nRF24L01+ module and the Arduino is established via an SPI interface. Fig. 10 shows a detailed description of the nRF24L01+ pinout.

    The SPI pins of the Arduino Mega 2560 Rev3 are MISO (pin 50), MOSI (pin 51), and SCK (pin 52). In this work, the pins are connected as follows: nRF24L01+ GND → Arduino GND; nRF24L01+ VCC → Arduino 3.3 V; nRF24L01+ CE → Arduino digital 7; nRF24L01+ CSN → Arduino digital 8; nRF24L01+ MOSI → Arduino digital 51; nRF24L01+ SCK → Arduino digital 52; nRF24L01+ MISO → Arduino digital 50. Furthermore, at the robot unit (Fig. 14), the nRF24L01+ module and the Arduino are wired in the same way as at the control unit.

  • 5.

    RC receiver (P23) wiring to Arduino: RC receivers are needed to drive the motors using pulse-width-modulated (PWM) signals [28]. A typical RC receiver such as the one used in this work (P23) has three kinds of pins: GND, 5 V, and the PWM signal pins. Hoverboard motors require three signal pins, i.e., throttle, steering and enable [29]. The receiver-to-Arduino wiring is therefore: RC_GND → Arduino_GND, RC_5V → Arduino_5V, RC_CH1 → Arduino_port2, RC_CH2 → Arduino_port3, and RC_CH3 → Arduino_port18.
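To illustrate the final wiring step, the sketch below decodes an RC PWM channel into a normalised motor command. The 1000–2000 µs pulse range with a 1500 µs centre is the common RC convention, assumed here rather than taken from the ROMR Arduino sketches.

```python
def pwm_to_command(pulse_us, min_us=1000, mid_us=1500, max_us=2000):
    """Map an RC PWM pulse width (µs) to a normalised command in [-1.0, 1.0].

    1500 µs (stick centred) -> 0.0, 1000 µs -> -1.0, 2000 µs -> +1.0.
    Out-of-range pulses are clamped so glitches cannot overdrive the motors.
    """
    pulse_us = max(min_us, min(max_us, pulse_us))
    if pulse_us >= mid_us:
        return (pulse_us - mid_us) / (max_us - mid_us)
    return (pulse_us - mid_us) / (mid_us - min_us)

throttle = pwm_to_command(1750)   # stick half forward
steering = pwm_to_command(1500)   # stick centred
print(throttle, steering)         # 0.5 0.0
```

On the robot, the same mapping would run on the Arduino for the throttle and steering channels before the commands are forwarded to the motor controller.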

Fig. 9.

Fig. 9

Internal wiring of the ODrive board. At the bottom, the three phases of the motors (M0 and M1) are connected. At the top, 22 nF capacitors are used as noise filters for the hall sensors.

Table 8.

Hall sensor wiring at the J4 signal port of the Odrive.

Hall wire J4 signal port
Red 5 V
Yellow A
Blue B
Green Z
Black GND
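The series-capacitor substitution suggested in the wiring instructions (two 47 nF capacitors in place of one 22 nF) follows from the series formula 1/C = 1/c1 + 1/c2; the snippet below is a quick illustration.

```python
def series_capacitance(*caps_nf):
    """Total capacitance of capacitors wired in series: 1/C = sum(1/Ci)."""
    return 1.0 / sum(1.0 / c for c in caps_nf)

# Two 47 nF capacitors in series give about 23.5 nF, close to the 22 nF target.
print(round(series_capacitance(47.0, 47.0), 3))   # → 23.5
```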

Fig. 10.

Fig. 10

The nRF24L01+ module with its pinout [27]. The module is used for wireless communication between the Arduino at the remote-control unit and the Arduino at the robot unit (see Fig. 14).

6. Operation instructions

To operate ROMR in real-time for the first time, power ON the robot by pressing the ON–OFF switch beside the Jetson Nano board (P16) and start controlling it with the transmitter. Note that after powering ON, a red LED blinks; wait until the blinking stops, then the robot is ready for use. However, if you have changed the default configuration, or wish to rebuild and reconfigure the robot from scratch to operate it in other modes, this section provides step-by-step instructions. Before these steps, make sure that all the robot parts are properly assembled and wired according to the instructions in Section 5.

6.1. Initial configuration and setup instruction

This section provides details about the initial configuration of the ROMR. It is recommended to follow the instructions in this section very carefully, as they determine how well the system will perform.

6.1.1. Nvidia Jetson Nano set up and ROS installation

To set up the Nvidia Jetson Nano, some basic tools are required: a microSD card (32 GB minimum recommended), a USB keyboard and mouse, a computer display (HDMI or DP), and a micro-USB power supply. The microSD card and micro-USB power supply are usually supplied with the Jetson Nano if you purchased the full development kit. The setup instructions are as follows:

6.1.2. Setup the ODrive tool and calibrate the BLDC motors

To begin the initial configuration and calibration of the hoverboard BLDC motors, make sure that all the necessary wiring has been completed as described in subSection 5.2. Also, ensure that all relevant switches are turned ON and that the motors are positioned such that they can move freely. The ODrive needs to be connected to the host computer (the Nvidia Jetson Nano). ODrive provides a Python 3 programming interface, called “odrivetool”, for configuring the BLDC motors and commanding them to move at a specific speed or by a specific number of rotations.

Therefore, Python 3 must be installed on the host computer before setting up the “odrivetool”. The setup instructions can be found at https://docs.odriverobotics.com. Alternatively, you can simply download and run the calibration script prepared at https://osf.io/awf9t, which avoids the tedious task of following the tutorial to calibrate the motors manually. The script configures the axes of the motors and their respective encoders, and sets motor parameters such as the velocity gain, the position gain, the bandwidth, and more. After successful calibration, you can test the motors from the “odrivetool” command line to ensure that they are properly configured and ready to receive velocity commands. First, start the “odrivetool” and, from the command line, send the following commands to the motors:

graphic file with name fx5.jpg

Repeat the same test with the second motor, which is connected to axis 1 of the ODrive board. If the configuration and calibration of the motors are correct, both motors should spin until they receive a commanded velocity of 0 or an idle-state command. At any point during the calibration and testing with the interactive “odrivetool”, use the code below to list calibration errors and to clear them.

graphic file with name fx6.jpg

6.1.3. Install Arduino integrated development environment (IDE) and connect to ROS

The Arduino IDE allows one to write software programs (sketches) and upload them to the Arduino board for robot control. The steps to set up the IDE, and connect it to ROS are described below:

  • 1.

    First, download the Arduino IDE at  https://www.arduino.cc/en/Guide and follow the onscreen instructions to set it up.

  • 2.

    Integrate the Arduino to communicate with ROS via the rosserial node. The rosserial_arduino package enables the Arduino to communicate with the Jetson Nano via a USB-A male to USB-B male cable. The setup instructions can be found at http://wiki.ros.org/rosserial_arduino. Note: it is advisable to use a udev rule for the USB devices. This allows the devices to be recognized and configured automatically when they are plugged in.
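A udev rule of the kind suggested above might look like the following; the vendor/product IDs and the symlink name are placeholders that must be replaced with the values reported by `lsusb` for your particular Arduino.

```
# /etc/udev/rules.d/99-arduino.rules  (IDs below are placeholders)
SUBSYSTEM=="tty", ATTRS{idVendor}=="2341", ATTRS{idProduct}=="0042", \
    SYMLINK+="arduino", MODE="0666"
```

After adding the rule, reload it with `sudo udevadm control --reload-rules && sudo udevadm trigger`; the board then appears under the stable path `/dev/arduino` regardless of enumeration order.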

  • 3.
    Launch the ROS serial server by running the code below at the command line to ensure that the setup was successful.

    graphic file with name fx7.jpg

    For a detailed explanation of the above rosserial python node, visit the ROS wiki at http://wiki.ros.org/rosserial_python#serial_node.py. Make sure that you check the port to which your Arduino is connected and the baud rate of your device. In our case, these are ttyACM0 and 115200, respectively; in your setup they may differ, so always take note of them.

  • 4.

    Connect the Arduino to the ODrive. First, install the ODriveArduino library: clone or download the repository https://github.com/odriverobotics/ODrive/tree/master/Arduino. From the Arduino IDE, select Sketch → Include Library → Add .ZIP Library and select the enclosed zip folder. Run the “ODriveArduinoTest.ino” sketch with the motors connected to ensure that everything is properly configured and ready to accept commands. If everything went well, the motors should move accordingly.

6.1.4. The MPU-9250 and nRF24L01 + modules setup

As shown in Fig. 14, the Arduino at the control unit reads the MPU-9250 data and forwards it to the robot via the nRF24L01+ module. To run the Arduino sketch, the “FaBo 9Axis MPU9250” library by Akira Sasaki, released under the Apache License, Version 2.0, must be installed, as well as the “RF24” library by TMRh20 and Avamander, released under the GNU General Public License. The FaBo 9Axis MPU9250 library is used for reading the data measured by the MPU-9250 sensor, while the RF24 library is needed for the use of the nRF24L01+ module. Both libraries can be found in the library manager of the Arduino IDE.

The Arduino at the robot unit primarily forwards the raw MPU-9250 data that it receives from the nRF24L01+ module to the Jetson Nano. This is done by publishing the data to the “imu/data_raw” topic of the ROS system running on the Jetson Nano. The Jetson Nano converts the raw data into velocity commands comprising a target linear velocity and a target rotation speed of the robot. The Arduino receives these velocity commands by subscribing to the “cmd_vel” topic and converts them into target speed values for the motors, which are set on the ODrive board.
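The cmd_vel-to-motor-speed conversion described above can be sketched with standard differential-drive kinematics. This is an illustrative sketch, not the ROMR firmware; the wheel radius and track width below are assumed placeholder values, not measured ROMR parameters.

```python
import math

# Assumed geometry (placeholders, not ROMR datasheet values):
WHEEL_RADIUS = 0.0825   # m, typical hoverboard wheel
TRACK_WIDTH = 0.50      # m, distance between the two drive wheels

def cmd_vel_to_wheel_speeds(linear_x, angular_z):
    """Convert a ROS cmd_vel pair (m/s, rad/s) into per-wheel speeds
    in revolutions per second, a unit the ODrive velocity input accepts."""
    # Differential drive: each wheel's linear speed is the body speed
    # plus/minus the rotational contribution at half the track width.
    v_left = linear_x - angular_z * TRACK_WIDTH / 2.0
    v_right = linear_x + angular_z * TRACK_WIDTH / 2.0
    circumference = 2.0 * math.pi * WHEEL_RADIUS
    return v_left / circumference, v_right / circumference
```

For a pure forward command both wheels receive the same speed; for a pure rotation the wheels receive equal and opposite speeds.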

6.1.5. Setup the RPlidar

The RPlidar sensor provides the scan data required for mapping, localization and navigation purposes. It is connected to the Jetson Nano or the host computer through a USB serial port. The procedure for its setup is summarized in the following steps.

  • 1.
    Clone the RPlidar ROS packages  https://github.com/Slamtec/rplidar_ros to your ROS catkin workspace source directory and run the code below to build the rplidarNode and rplidarNodeClient.

    graphic file with name fx8.jpg

  • 2.
    Check the authority of the rplidar serial port by typing at the command window:

    graphic file with name fx9.jpg

    Take note of the port to which the USB is connected, e.g., …/ttyUSB0. Grant write permission to the USB port:

    graphic file with name fx10.jpg

  • 3.
    Launch the RPlidar node to view and test if the setup was successful.

    graphic file with name fx11.jpg

    If correctly set up, you will obtain an output similar to the one displayed in Fig. 15b with the lidar scan represented as red dots.

Fig. 15.


ROMR simulation platform. (a) The operational environment model with the robot models together with the sensor model. The blue lines are the lidar sensor scan of the operational environment in the Gazebo. (b) Rviz visualisation. The red dotted lines are the lidar scan showing the location of obstacles (or objects) within the robot environment.

After the above setups, the ROMR is ready for experimentation.

6.2. Experimentation & remote operation instruction

To allow a human operator to intuitively control the ROMR, we developed three remote control techniques. In this section, we provide step-by-step instructions on how these techniques can be implemented.

6.2.1. ROMR teleoperation from remote-control (RC) devices

By default, the ROMR is configured to operate in RC mode. However, if the default configuration has been altered, the following steps must be taken to reconfigure it. Before these steps, ensure that the ODrive controller has already been calibrated and configured to accept commands (see sub-subSection 6.1.2 for instructions). Also, ensure that the RC receiver is properly wired according to the instructions in subSection 5.2.

  • 1.

    Upload the “romr_remote_control.ino” sketch to the Arduino. Before that, the “Metro” library has to be installed in the Arduino IDE; it can be downloaded from https://github.com/thomasfredericks/Metro-Arduino-Wiring.

  • 2.

    With the motors switched off, move the RC transmitter sticks and monitor the output in the Arduino serial plotter. If the receiver and the transmitter are communicating, you will obtain a response similar to the one shown in Fig. 11.

6.2.2. ROMR control from Android-based device

One of the main features of the ROMR is that it can be teleoperated from any Android-based device. The idea is to alleviate the need for dedicated robot teleoperation devices, such as a joystick or an RC transmitter, and to provide an intuitive way of controlling the robot by simply touching the Android device screen. To achieve this, we leveraged the framework developed by Rottmann Nils et al. [30]. The setup is straightforward. First, make sure that the Arduino has been set up to communicate with ROS; if not, follow the instructions in Section 6. The “rosserial” package is essential for this section, so make sure that the rosserial python node (rosserial_python serial_node.py) is running properly. The whole communication structure is described in Fig. 12a.

As illustrated in Fig. 12, the rosserial python node allows all the compatible connected electronics to communicate directly with the robot using ROS topics and messages. All the information between the interconnected systems is exchanged with the help of the rosserial package. Make sure that the robot is switched ON, the battery is connected, and the wheels are free to spin. If you have changed the default ODrive calibration, make sure that the calibration is completed before continuing. Open the downloaded ROS-Mobile App, which enables ROS to control the robot’s joint velocities. The App supports linear (forward and backward) and angular (rotation around the z-axis) movements; see Fig. 12b for the setup. An SSH connection has to be established between the devices, and all the devices have to be on the same wireless network. The steps are summarised as follows:

  • 1.

    Download the ROS-Mobile App from the Google Playstore.

  • 2.

    Configure the IP address. First, connect the robot and the Android device to the same wireless network. From the command window terminal, type “ifconfig”; this will display the IP address, e.g., 192.168.1.15.

  • 3.

    At the “master” node URI of the ROS-Mobile App, enter the IP address and 11311 for the “master” port. Ensure that roscore is running, and click on the connect button.

  • 4.
    Once the above steps are completed, upload the “ros_mobile_control.ino” sketch to Arduino. While the roscore is still running, run the rosserial python node in a separate terminal:

    graphic file with name fx12.jpg

    Take note of the …/dev/ttyACM0 Arduino port. It may be different in your case.

  • 5.

    At the “details” tab of the ROS-Mobile App, select “Add widget”, then “joystick”, and set the XYZ-coordinates accordingly. Click on the “viz” tab to visualise and control the robot. Ensure that the rosserial python node is running and that the cmd_vel topic is being subscribed to. Once done, the robot can be controlled by simply touching the respective coordinates on the screen.

6.2.3. Gesture-based control of the ROMR

Unlike traditional or “ready-to-use” robot control approaches, such as a joystick or the ROS rqt plugin, a gesture-based approach has the potential to control the robot in a very intuitive way. This strategy allows the operator to focus on the robot instead of the controller. The goal is for a non-robotics expert to be able to remotely navigate the robot depending on the direction in which the operator’s hand is tilted. For example, if the hand is tilted forward (pitch angle), the robot should move forward; if the hand is tilted to the side (roll angle), the robot should rotate. Since the MPU-9250 sensor does not measure the orientation of the operator’s hand directly, but only the acceleration and the rotation speed of the sensor, the IMU sensor (P19) data must be fused to determine the orientation of the operator’s hand and the tilt angle of the sensor.

We implemented four gestures with different hand motions (see Fig. 13). We used the IMU sensor (P19) to measure the hand gestures needed and map them onto the robot’s linear and angular velocities. The data collected by the IMU sensor is sent via a wireless connection to the robot, which uses it to calculate the movement commands for the motors. We leveraged the framework proposed in [31], [32] to achieve the gesture-based control strategy.
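The sensor-fusion step described above can be sketched with a simple complementary filter. This is an illustrative sketch rather than the exact filter used in the ROMR node, and the filter coefficient is an assumed value.

```python
import math

ALPHA = 0.98  # assumed coefficient: trust the gyro short-term,
              # the accelerometer long-term

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=ALPHA):
    """Fuse a gyro rate (rad/s) with an accelerometer tilt estimate (rad).

    The gyro integral tracks fast motion; the accelerometer term slowly
    corrects the drift of that integral.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_pitch(ax, ay, az):
    """Pitch angle (rad) of the sensor estimated from raw accelerations,
    assuming gravity is the dominant acceleration."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))
```

With the hand held still, the filtered angle converges to the accelerometer estimate; during fast gestures, the gyro term dominates, which is what makes the tilt-based commands responsive.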

Fig. 13.


ROMR real-time control and monitoring based on hand movement: (a) hand tilted forward: the robot moves forward; (b) hand tilted backwards: the robot moves backwards; (c) hand tilted to the right: the robot rotates in a clockwise direction; (d) hand tilted to the left: the robot rotates in a counter-clockwise direction.

A detailed overview of the hardware and software architecture and the data flow is shown in Fig. 14. The components are divided into a control unit and a robot unit, as the components of these two units are physically located in different places. While the control unit is attached to the user’s arm/hand, the robot unit is mounted on the ROMR board.

The step-by-step instruction to operate the robot based on gesture demonstration is described as follows:

  • 1.

    At the control unit, upload the “IMUDataNRF24L01_Transmitter.ino” sketch to the Arduino, and at the robot unit, upload the “NRF24L01Receiver_PC_WheelController_WheelMonitoring.ino” sketch to the Arduino.

  • 2.
    From the host computer, run roscore to start the ROS master, and then:

    graphic file with name fx13.jpg

    to start the rosserial python node and establish a connection between the Arduino and the Jetson Nano. Depending on which port of the Jetson Nano the Arduino is connected to, the port ttyACM0 must be adjusted accordingly.
  • 3.
    Finally, execute the following script to start the complementary filter node:

    graphic file with name fx14.jpg

    By switching the robot ON, the motors are automatically powered and ready to be controlled with the IMU sensor. Depending on the direction in which the IMU sensor is tilted, a corresponding movement of the robot is obtained, as shown in Fig. 13a-d. During operation, care must be taken not to tilt the hand by about 90° or more.

7. Validation and characterization

ROMR has been successfully developed, tested and validated both in simulation and in real-world scenarios. Experiments were performed to characterise its performance, robustness and suitability for research, navigation and logistics applications. The evaluation results are presented in this section. Furthermore, the validation video can be viewed at  https://osf.io/ku8ag.

7.1. Simulation scenarios

Although the development of ROMR focused on real-world applications, it is also important to have a 3D simulation model of the robot, to enable users to work in virtual environments to explore tools, techniques and methods. Furthermore, the simulation model could also enable the ROMR users to become familiar with 3D simulation and visualisation tools such as Gazebo [33] and Rviz [34].

The simulation platform comprises three parts: the environment model, the robot model, and the sensor model. For the environment model, we created the floor plan of our laboratory using the Gazebo model editor (see Fig. 15a). Taking advantage of the ROS framework [22], the ROMR model was implemented in accordance with the unified robot description format (URDF) [23]. The URDF is an extensible markup language (XML) format that describes all kinematic and dynamic properties of the robot and its physical elements, such as the links, the joints, the actuators, and the sensors [35].
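A minimal URDF fragment of the kind described above might look as follows; the link names, dimensions, masses and inertias are illustrative placeholders rather than the actual ROMR values.

```xml
<?xml version="1.0"?>
<robot name="romr_minimal">
  <!-- Chassis (placeholder dimensions and mass) -->
  <link name="base_link">
    <inertial>
      <mass value="10.0"/>
      <inertia ixx="0.1" ixy="0" ixz="0" iyy="0.1" iyz="0" izz="0.1"/>
    </inertial>
    <visual>
      <geometry><box size="0.6 0.4 0.2"/></geometry>
    </visual>
    <collision>
      <geometry><box size="0.6 0.4 0.2"/></geometry>
    </collision>
  </link>
  <!-- One drive wheel; the real model has two plus a caster -->
  <link name="left_wheel">
    <visual>
      <geometry><cylinder radius="0.0825" length="0.05"/></geometry>
    </visual>
  </link>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <origin xyz="0 0.25 0" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>
```

Sensors such as the lidar and camera are attached in the same way, as additional links with fixed joints, which is how the full ROMR description references the 3D CAD models.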

We generated the URDF of the robot including the sensors model using the 3D CAD models described in Table 4. Further, all the models were verified with several simulation tests as depicted in Fig. 15. The step-by-step procedure for this simulation is as follows:

  • 1.

    Download the ROMR ROS files at https://osf.io/e4syc to your catkin workspace and build it.

  • 2.
    Open three terminal windows, and execute the following in each of the terminals:

    graphic file with name fx15.jpg

    graphic file with name fx16.jpg

    to launch the ROMR world (the operational environment model in Gazebo [33]) with the robot spawned and the sensor model active (see Fig. 15a). In the third terminal, run the following node to visualise in Rviz (see Fig. 15b).

    graphic file with name fx17.jpg

  • 3.

    Navigate the robot within the operational environment using the ROS rqt plugin, the keyboard, or the framework described in sub-subSection 6.2.2.

7.2. General system validation test

This section evaluates the robustness of the system through different tests on the robot and its sub-components. The outcome of each test is presented in Table 9.

Table 9.

General systems validation test.

Test name Test purpose Test process Expected result Outcome
Chassis test To verify the solidity and stability of the robot’s chassis Complete five different movement tasks at high speed with all the components mounted All the subsystems must be stable and rigid throughout the test Passed
Payload test To verify the maximum load the robot can carry without affecting the controller Place different loads on the robot and check the response The robot should be able to convey loads up to the maximum capacity Carried up to 90 kg (see Fig. 16)
Reconfiguration test To verify how long it takes and how easy it is to dismantle and re-assemble the system Dismantle all the subsystems, including the chassis and wiring, and re-assemble them. Record the time taken to complete the process Should not take more than 3 h Took about 1 h 15 min
Battery life test To evaluate how long the battery can power the robot during a long mission Operate the robot continuously with all the electronic parts active for a long period of time The battery should last up to its maximum capacity The battery lasted for about 8 h
ROMR stability test To verify the stability of the robot when driven at high speed and at small turning radii Drive the robot forwards and backwards, at high speed, and in circular paths of different radii. Record the linear and angular velocity data The robot should be stable throughout the stability test See subSection 7.2.2 and Fig. 17 for the results

7.2.1. Payload capacity test

In Fig. 16, the result of the payload capacity test described in Table 9 is presented. The robot was teleoperated to move along linear and angular trajectories with different loads, ranging from the robot weight only (17.1 kg) to 90 kg. The goal was to verify how much load the robot can carry without affecting the controller. Although the ROMR can carry loads of up to 90 kg, operating it continuously at maximum load is not recommended, as this shortens its life span. All load tests were carried out on a flat surface; we did not evaluate the performance on irregular or unstructured surfaces.

Fig. 16.


Validation of the maximum load capacity of the ROMR. Shown are (a) ROMR weight only (17.1 kg), (b) 16 kg, (c) 25 kg, (d) 85 kg, and (e) 90 kg.

7.2.2. Stability test

We performed this test to evaluate the stability of the robot at high speed and different turning radii. We tested the robot’s stability both in Gazebo simulation and in the real world at four different speeds: 0.5 m/s, 1.0 m/s, 1.5 m/s, and 2.5 m/s. For each speed, we tested the robot at different turning radii (0.5 m, 1.0 m, 1.5 m, 2.0 m, and 2.5 m), three different payloads (ROMR weight, 25 kg, and 85 kg), and three different positions of the centre of gravity (-0.1 m, 0.0 m, and 0.1 m). Note that the positions of the centre of gravity (pCOG) are defined relative to the midpoint of the ROMR’s base frame. The centre of gravity (COG) is slightly altered by placing the loads (25 kg and 85 kg) at positions 0.1 m (front), -0.1 m (behind), and 0.0 m (COG unaltered) from the midpoint of the ROMR base frame. The results are summarized in Table 10.

Table 10.

Minimum turning radius of the robot for different linear velocities, payload weights, and positions of the centre of gravity (pCOG).

Linear Vel. (m/s) Payload (kg) pCOG (m) Turning radius (m) Stability
0.5 ROMR weight (17.1) 0.0 0.5 stable
0.5 25 0.1 1.0 stable
0.5 85 - 0.1 1.5 stable



1.0 ROMR weight (17.1) 0.0 2.0 stable
1.0 25 0.1 2.5 stable
1.0 85 - 0.1 1.5 stable



1.5 25 - 0.1 1.0 stable
1.5 85 0.1 2.5 stable
1.5 ROMR weight (17.1) 0.0 1.5 stable



2.5 25 - 0.1 1.5 stable
2.5 85 0.1 2.5 stable
2.5 ROMR weight (17.1) 0.0 0.5 unstable

Fig. 17 shows the results of the stability test. The robot was driven in a circular path at different linear velocities, turning radii, payload weights, and pCOG. The odometry, command-velocity and IMU data were recorded in a rosbag file for our analysis. At the lowest speed of 0.5 m/s, the robot showed some minor oscillations when turning at the smallest radius of 0.5 m; however, these oscillations were not significant enough to cause the robot to lose control or become unstable. At the higher velocities of 1.0 m/s and 1.5 m/s, the robot remained stable even when turning at the smallest radius of 0.5 m. At 2.5 m/s, however, the robot started to show signs of instability, such as tilting and sliding, particularly at the smallest turning radius. It can therefore be concluded that the maximum stable linear velocity of the robot is about 2.5 m/s.
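The observed loss of stability at high speed and small turning radius is consistent with a simple quasi-static rollover model, sketched below. The track width and centre-of-gravity height are assumed placeholder values, not measured ROMR parameters.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_stable_speed(turn_radius, track_width=0.50, cog_height=0.20):
    """Quasi-static rollover limit for a differential-drive robot.

    The robot tips outward when the lateral acceleration v^2 / r exceeds
    g * (track_width / 2) / cog_height, so the limiting speed is
    sqrt(g * (track_width / 2) / cog_height * r).
    """
    return math.sqrt(G * (track_width / 2.0) / cog_height * turn_radius)
```

The model predicts the qualitative trends seen in Table 10: tighter turns and a higher or load-shifted centre of gravity both lower the speed at which the robot remains stable.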

Fig. 17.


The ROMR stability test. The robot was controlled to follow circular paths at different linear velocities, turning radii, payload weights, and positions of the centre of gravity. As shown in sub-figures (c)-(f), the robot became unstable at about 215 s, when the linear velocity increased to 2.5 m/s (see the red rectangles). (a) depicts the robot’s trajectory as it follows the circular path. (b) shows the robot’s position on the x (red) and y (blue) axes at each time stamp. (c) shows the linear velocity in m/s (blue) and the roll angle in radians (dark red) at each time stamp. (d) shows the angular velocity of the robot about the x (blue), y (green) and z (red) axes at each time stamp. (e) shows the orientation about the x (red) and y (yellow) axes, respectively. (f) shows the linear acceleration along the x (green) and y (cyan) axes.

To conduct the stability test, the following steps should be taken:

  • 1.
    Launch the robot in the Gazebo environment:

    graphic file with name fx18.jpg

  • 2.
    The following nodes are needed only when launching the robot in the real world; for the Gazebo simulation, they are not necessary.

    graphic file with name fx19.jpg

    graphic file with name fx20.jpg

  • 3.
    Start the stability test by launching the following node. The robot trajectory and a CSV file containing the necessary data for further analysis will be generated at the end of the test.

    graphic file with name fx21.jpg

  • 4.
    If needed, record the IMU, the odometry, and the velocity data in a rosbag file for further analysis:

    graphic file with name fx22.jpg

A video showing the result of the above test can be found at  https://osf.io/wcd4n.

7.3. Application of the ROMR for SLAM

As stated earlier, ROMR provides a framework for evaluating and developing SLAM algorithms. The SLAM problem consists of building a map of an unknown environment (mapping) while simultaneously keeping track of the estimate of the robot’s pose (the position x, y and the orientation) within it, given a series of control inputs and sensor observations [19], [36]. We evaluated the Hector-SLAM algorithm [37] with our RPlidar sensor (P25) to build the map of our real laboratory environment. The adaptive Monte Carlo localization (AMCL) [38] approach was used to localise the robot within the built map. The advantage of the Hector-SLAM technique over other 2D SLAM techniques, such as Gmapping [39] and Google Cartographer [40], is that it only requires laser scan data and does not need odometry data to build the map. To generate the map, the following steps have to be followed:

  • 1.

    Set up the RPlidar as described in sub-subSection 6.1.5.

  • 2.

    Download or clone the Hector-SLAM packages at  https://github.com/tu-darmstadt-ros-pkg/hector_slam.git to your ROS workspace, and set the coordinate frame parameters according to the instruction at the ROS wiki page  http://wiki.ros.org/hector_slam. Build the ROS workspace including the Hector-SLAM and RPlidar packages (catkin_make), then proceed to the next step.

  • 3.
    Open four terminal windows, and run the following in each of the terminal windows:

    graphic file with name fx23.jpg

    graphic file with name fx24.jpg

    graphic file with name fx25.jpg

    While all the nodes are running, navigate the robot around the environment using any of the control approaches implemented in sub-subSections 6.2.1, 6.2.2, 6.2.3. While building the map, it is recommended to move the robot at a low speed so that a high-quality map is created.

  • 4.
    After the mapping is completed, execute the following at the fourth terminal:

    graphic file with name fx26.jpg

    Take note of the location where the map is saved; it will be required for localisation and autonomous navigation. The saved map can be viewed on your screen by running:

    graphic file with name fx27.jpg

Fig. 18 shows the map built in the virtual laboratory environment with the ROMR, and Fig. 19a shows the map built in the real laboratory environment. For the localisation of the robot, the employed AMCL approach uses a particle filter to track the pose of the robot [41]. It maintains a probability distribution over the set of all possible robot poses [38] and updates this distribution using the data from the ROMR odometry and the RPlidar scan (P25). Fig. 19b shows the localisation of the ROMR within the 2D occupancy grid map, where the dark green clusters denote the AMCL particles representing the estimates of the location of the robot.
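The particle-filter cycle described above can be sketched in a few lines. This is a minimal illustration of the predict/re-weight/resample loop, not the actual AMCL implementation; the motion-noise value and the likelihood function are assumptions.

```python
import math
import random

def motion_update(particles, delta, noise=0.05):
    """Predict step: propagate each particle (x, y, theta) by an odometry
    increment, adding Gaussian noise to model motion uncertainty."""
    dx, dy, dtheta = delta
    return [(x + dx + random.gauss(0, noise),
             y + dy + random.gauss(0, noise),
             theta + dtheta + random.gauss(0, noise))
            for (x, y, theta) in particles]

def measurement_update(particles, likelihood):
    """Correct step: re-weight each particle by the sensor likelihood of
    its pose, then resample proportionally to the weights."""
    weights = [likelihood(p) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(particles, weights=weights, k=len(particles))
```

In AMCL, the likelihood compares the lidar scan predicted from a particle's pose against the actual scan and the occupancy grid map; particles consistent with the scan survive resampling, which produces the clusters visible in Fig. 19b.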

Fig. 18.


Generation of the 2D occupancy grid map of the environment using the 360° lidar sensor (P25) and the Hector-SLAM algorithm [37]. (a) Virtual laboratory world in Gazebo. The blue lines represent the lidar scan. (b) Occupancy grid map of the operational environment in Rviz. The pale grey areas indicate the unoccupied (free) spaces that the robot can navigate, the black lines represent occupied areas not traversable by the robot, and the green line represents the robot’s trajectory.

Fig. 19.


Applying ROMR to a real-world SLAM problem. (a) The 2D occupancy grid map of the operational environment generated with the P25 lidar and the Hector-SLAM algorithm. The robot’s trajectory is represented by the green line. (b) Localising the ROMR within the map with the adaptive Monte Carlo localisation (AMCL) algorithm. The dark green clusters are the AMCL particles that represent the estimates of the location of the robot.

7.4. Proposed maintenance for the ROMR

Finally, to increase the life span of the ROMR, predictive and corrective maintenance is necessary, with the aim of keeping the robot working at maximum efficiency. Predictive maintenance measures, such as listening for abnormal noises, visually inspecting all parts of the robot, checking the energy storage level, and checking for vibrations, mechanical defects, improper connections and calibration errors, are proposed before each operation to reduce the probability of failure during operation. Corrective maintenance, on the other hand, should be performed after a failure has been detected.

8. Conclusion

In this paper, we presented ROMR, a ROS-based open-source mobile robot for research and industrial applications. We provided detailed information about the hardware design, the architecture, the operation instructions, and the advantages it offers compared to commercial platforms. The entire design utilises commercially available off-the-shelf electronic components, additive manufacturing technologies and aluminium profiles, to speed up the re-prototyping of the framework for custom or general-purpose applications. We implemented several control techniques that enable a non-robotics expert to operate the robot easily and intuitively. Furthermore, we demonstrated the applicability of the ROMR to logistics problems by implementing navigation and simultaneous localisation and mapping (SLAM) algorithms, which are fundamental prerequisites for autonomous robots. The experimental validation of the robustness and performance is illustrated in the video https://osf.io/ku8ag. Future work will focus on porting the whole platform to ROS 2. As an open-source platform, the scientific community has been granted permission to use all the design files published at https://doi.org/10.17605/OSF.IO/K83X7. This open-source strategy supports rapid progress in the development of intelligent mobile robots and dexterous systems.

CRediT authorship contribution statement

Linus Nwankwo: Conceptualization, Software, Writing - original draft. Clemens Fritze: Software, Writing – review & editing. Konrad Bartsch: Construction and CAD design. Elmar Rueckert: Supervision, Validation, Writing – review & editing.

Declaration of Competing Interest

The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: This project has received funding from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) No #430054590 (TRAIN).

Acknowledgements

This project has received funding from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) No #430054590 (TRAIN).

Biographies

graphic file with name fx1.jpg

Linus Nwankwo is pursuing his PhD in robotics at the Chair of Cyber-Physical Systems (CPS), Montanuniversität Leoben, Austria. Prior to joining CPS as a PhD student, he worked as a research intern at the Department of Electrical and Computer Engineering, Technische Universität Kaiserslautern, Germany. In 2020, he obtained his Master of Science (M.Sc.) degree in Automation and Robotics, a speciality in control for Green Mechatronics (GreeM) at the University of Bourgogne Franche-Comte (UBFC), France. His research interests include robot dynamics modelling, sensor fusion, simultaneous localization & mapping (SLAM), and path planning.

graphic file with name fx2.jpg

Clemens Fritze is a master’s student in Mechatronics at the Johannes Kepler Universität Linz. Prior to his master’s program, he studied Mechanical Engineering for his bachelor’s degree at the Montanuniversität Leoben, Austria. His research interest is in robotics.

graphic file with name fx3.jpg

Konrad Bartsch is a senior technician at the chair of Cyber-Physical-Systems, Montanuniversität Leoben, Austria. He studied Building Electronics and Mechanical Engineering at Höhere Technische Bundeslehranstalt Graz-Gösting, Austria. His research interests include cyberphysical systems, modern technologies, machine learning and robotics.

graphic file with name fx4.jpg

Elmar Rueckert has been the chair of Cyber-Physical-Systems Institute at the Montanuniversität Leoben, Austria since March 2021. He received his PhD in computer science at the Graz University of Technology in 2014. He worked for four years as a senior researcher and research group leader at the Technical University of Darmstadt. Thereafter, he worked for three years as an assistant professor at the University of Lübeck. His research interests include stochastic machine and deep learning, robotics and reinforcement learning and human motor control. For more information visit https://cps.unileoben.ac.at/univ-prof-dr-elmar-rueckert/.

References

  • 1.R. Sell, A. Rassõlkin, R. Wang, T. Otto, Integration of autonomous vehicles and Industry 4.0, Proceedings of the Estonian Academy of Sciences, vol. 68, pp. 389–394, 2019. DOI: 10.3176/proc.2019.4.07. URL: http://vana.kirj.ee/public/proceedings_pdf/2019/issue_4/proc-2019-4-389-394.pdf.
  • 2.Hancock P.A., Nourbakhsh I., Stewart J. On the future of transportation in an era of automated and autonomous vehicles. Proc. Nat. Acad. Sci. 2019;116(16):7684–7691. doi: 10.1073/pnas.1805770115.
  • 3.Cupek R., Drewniak M., Fojcik M., Kyrkjebø E., Lin J.C.-W., Mrozek D., Øvsthus K., Ziebinski A. Autonomous Guided Vehicles for Smart Industries – The State-of-the-Art and Research Challenges. In: Krzhizhanovskaya V.V., Závodszky G., Lees M.H., Dongarra J.J., Sloot P.M.A., Brissos S., Teixeira J., editors. Computational Science – ICCS 2020. Springer International Publishing; Cham: 2020. pp. 330–343.
  • 4.A. Markis, M. Papa, D. Kaselautzke, M. Rathmair, V. Sattinger, M. Brandstötter, Safety of Mobile Robot Systems in Industrial Applications, 2019. DOI: 10.3217/978-3-85125-663-5-00. URL: https://www.researchgate.net/publication/337339431_Safety_of_Mobile_Robot_Systems_in_Industrial_Applications.
  • 5.Fragapane G., de Koster R., Sgarbossa F., Strandhagen J.O. Planning and control of autonomous mobile robots for intralogistics: Literature review and research agenda. Eur. J. Oper. Res. 2021;294(2):405–426. doi: 10.1016/j.ejor.2021.01.019.
  • 6.Zhong J., Ling C., Cangelosi A., Lotfi A., Liu X. On the Gap between Domestic Robotic Applications and Computational Intelligence. Electronics. 2021;10:793. doi: 10.3390/electronics10070793. URL: https://www.mdpi.com/2079-9292/10/7/793.
  • 7.Palacín J., Rubies E., Clotet E. The Assistant Personal Robot Project: From the APR-01 to the APR-02 Mobile Robot Prototypes. Designs. 2022;6:66. doi: 10.3390/designs6040066. URL: https://www.mdpi.com/2411-9660/6/4/66.
  • 8.H. Unger, T. Markert, E. Müller, Evaluation of use cases of autonomous mobile robots in factory environments, Procedia Manufacturing, vol. 17, pp. 254–261, 2018. doi: 10.1016/j.promfg.2018.10.044. Presented at the 28th International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2018), June 11–14, 2018, Columbus, OH, USA.
  • 9.V.M. Pawar, J. Law, C. Maple, Manufacturing Robotics: The Next Robotic Industrial Revolution, 2016. URL: https://www.ukras.org.uk/wp-content/uploads/2021/01/UKRASWP_ManufacturingRobotics2016_online.pdf.
  • 10.R.D. Atkinson, Robotics and the Future of Production and Work, 2019.
  • 11.Pagliarini L., Lund H. The future of Robotics Technology. J. Robot. Networking Artif. Life. 2017;3:270. doi: 10.2991/jrnal.2017.3.4.12. URL: https://www.researchgate.net/publication/315986147_The_future_of_Robotics_Technology.
  • 12.Fragapane G.I., Ivanov D.A., Peron M., Sgarbossa F., Strandhagen J.O. Increasing flexibility and productivity in Industry 4.0 production networks with autonomous mobile robots and smart intralogistics. Ann. Oper. Res. 2022;308:125–143. doi: 10.1007/s10479-020-03526-7.
  • 13.Grimminger F., Flayols T., Fiene J., Badri-Spröwitz A., Righetti L., Meduri A., Khadiv M., Viereck J., Wuthrich M., Naveau M., Berenz V., Heim S., Widmaier F. An Open Torque-Controlled Modular Robot Architecture for Legged Locomotion Research. IEEE Robot. Autom. Lett. 2020. doi: 10.1109/LRA.2020.2976639. URL: https://arxiv.org/abs/1910.00093.
  • 14.Hui N.B., Pratihar D.K. Design and Development of Intelligent Autonomous Robots. Springer Berlin Heidelberg; Berlin, Heidelberg: 2010. pp. 29–56.
  • 15.Betancur-Vásquez D., Mejia-Herrera M., Botero-Valencia J. Open source and open hardware mobile robot for developing applications in education and research. HardwareX. 2021;10. doi: 10.1016/j.ohx.2021.e00217.
  • 16.W. Jo, J. Kim, R. Wang, J. Pan, R.K. Senthilkumaran, B.-C. Min, SMARTmBOT: A ROS2-based Low-cost and Open-source Mobile Robot Platform, arXiv preprint arXiv:2203.08903, 2022. URL: https://doi.org/10.48550/arXiv.2203.08903.
  • 17.Prinz R., Bulbul R., Scholz J., Eder M., Steinbauer-Wagner G. Off-Road Navigation Maps for Robotic Platforms using Convolutional Neural Networks. AGILE: GIScience Ser. 2022;3:55. doi: 10.5194/agile-giss-3-55-2022. URL: https://agile-giss.copernicus.org/articles/3/55/2022/.
  • 18.S. Kolski, Mobile Robots: Perception & Navigation, IntechOpen, London, United Kingdom, 2007. DOI: 10.5772/36. URL: https://doi.org/10.5772/36.
  • 19.Frese U., Wagner R., Röfer T. A SLAM overview from a user's perspective. KI. 2010;24:191–198. doi: 10.1007/s13218-010-0040-4.
  • 20.Thrun S. Probabilistic Algorithms in Robotics. AI Mag. 2000;21.
  • 21.J. Shabbir, T. Anwer, A survey of deep learning techniques for mobile robot applications, arXiv preprint arXiv:1803.07608, 2018. URL: https://arxiv.org/abs/1803.07608.
  • 22.M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, A. Ng, ROS: an open-source Robot Operating System, vol. 3, 2009. URL: http://robotics.stanford.edu/ang/papers/icraoss09-ROS.pdf.
  • 23.wiki.ros.org, Learning URDF Step by Step, ROS Wiki. URL: http://wiki.ros.org/urdf/Tutorials, available online.
  • 24.odriverobotics.com, Getting Started, ODrive Documentation. URL: https://docs.odriverobotics.com/v/0.5.4/getting-started.html, available online.
  • 25.odriverobotics.com, ODrive Pro Documentation. URL: https://docs.odriverobotics.com/v/latest/ground-loops.html, available online.
  • 26.Coviello G., Avitabile G. Multiple Synchronized Inertial Measurement Unit Sensor Boards Platform for Activity Monitoring. IEEE Sens. J. 2020. doi: 10.1109/JSEN.2020.2982744. URL: https://www.researchgate.net/publication/340110168_Multiple_Synchronized_Inertial_Measurement_Unit_Sensor_Boards_Platform_for_Activity_Monitoring.
  • 27.lastminuteengineers.com, How nRF24L01+ Wireless Module Works & Interface with Arduino. URL: https://lastminuteengineers.com/nrf24l01-arduino-wireless-communication/.
  • 28.D. Madison, How to Use an RC Controller with an Arduino, Parts Not Included blog. URL: https://www.partsnotincluded.com/how-to-use-an-rc-controller-with-an-arduino/, available online, August 27, 2020.
  • 29.L. Kaul, Hoverboard motors turned into an RC skater, Arduino Blog. URL: https://blog.arduino.cc/2019/06/10/hoverboard-motors-turned-into-an-rc-skater/, available online, June 10, 2019.
  • 30.N. Rottmann, N. Studt, F. Ernst, E. Rueckert, ROS-Mobile: An Android application for the Robot Operating System, arXiv preprint arXiv:2011.02781, 2020.
  • 31.Fu G., Azimi E., Kazanzides P. Mobile Teleoperation: Feasibility of Wireless Wearable Sensing of the Operator's Arm Motion. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), January 2022. doi: 10.1109/IROS51168.2021.9636838.
  • 32.S. Li, J. Jiang, P. Ruppel, H. Liang, X. Ma, N. Hendrich, F. Sun, J. Zhang, A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU, arXiv, 2020. URL: https://doi.org/10.48550/arXiv.2003.05212.
  • 33.gazebosim.org, Simulate before you build, Gazebo. URL: https://gazebosim.org/home, available online.
  • 34.wiki.ros.org, rviz: 3D visualization tool for ROS, ROS Wiki. URL: http://wiki.ros.org/rviz, available online, August 16, 2022.
  • 35.F. Sanfilippo, O. Stavdahl, P. Liljeback, SnakeSIM: A ROS-based rapid-prototyping framework for perception-driven obstacle-aided locomotion of snake robots, in: 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 1226–1231, 2017. DOI: 10.1109/ROBIO.2017.8324585.
  • 36.E. Rueckert, Simultaneous localisation and mapping for mobile robots with recent sensor technologies, Master's thesis, Graz University of Technology, 2010. URL: https://cps.unileoben.ac.at/wp/MScThesis2009Rueckert.pdf.
  • 37.S. Kohlbrecher, J. Meyer, T. Graber, K. Petersen, U. Klingauf, O. Von Stryk, Hector Open Source Modules for Autonomous Mapping and Navigation with Rescue Robots, pp. 624–631, 2014. DOI: 10.1007/978-3-662-44468-9_58.
  • 38.D. Fox, W. Burgard, F. Dellaert, S. Thrun, Monte Carlo Localization: Efficient Position Estimation for Mobile Robots, pp. 343–349, 1999. URL: http://robots.stanford.edu/papers/fox.aaai99.pdf.
  • 39.Grisetti G., Stachniss C., Burgard W. Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters. IEEE Trans. Rob. 2007;23(1):34–46. doi: 10.1109/TRO.2006.889486. URL: https://ieeexplore.ieee.org/document/4084563.
  • 40.W. Hess, D. Kohler, H. Rapp, D. Andor, Real-Time Loop Closure in 2D LIDAR SLAM, in: 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 1271–1278, 2016. URL: https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/45466.pdf.
  • 41.Rekleitis I. A particle filter tutorial for mobile robot localization, Technical report, McGill University, 2004. URL: https://www.cim.mcgill.ca/yiannis/particletutorial.pdf.

Articles from HardwareX are provided here courtesy of Elsevier