Developing your code
On first run, the robot will execute an example program for convenience. This program will be copied to the directory that competition-simulator-<version> is stored in:
```
.
├── competition-simulator-<version>
│   ├── ...
│   └── worlds
│       └── Arena.wbt
└── robot.py
```
Your code should be developed in `robot.py`.
There is a pre-built robot used in the simulator. To allow this simulated robot to move around and sense its environment, a set of motors and sensors has been connected, as detailed below.
The simulator's API is very similar to the real SR API described in the programming docs. The main differences are the way that time is handled, some discrepancies in the vision API (which we hope to resolve soon), and the fact that the simulated robot does not have the Brain Board LEDs.
Your robot has one motor board attached; the left wheel is connected to the first port and the right wheel to the second.
The motor board has the part code `srABC1`, but since only a single motor board is attached it can be referenced as `R.motor_board`.
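As a sketch (the import path is an assumption; copy whatever the example program uses), driving both wheels might look like:

```python
from sr.robot import Robot  # import path assumed; check the example program

R = Robot()

# The first port drives the left wheel, the second the right
R.motor_board.motors[0].power = 0.5   # half power forwards
R.motor_board.motors[1].power = 0.5

R.sleep(1)  # let the robot drive for one simulated second

R.motor_board.motors[0].power = 0     # stop both wheels
R.motor_board.motors[1].power = 0
```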
Your robot has one servo board attached; the jaws of the robot are controlled by a pair of servos. Setting each servo to -1 fully opens the respective jaw.
The servo board has the part code `srXYZ2`, but since only a single servo board is attached it can be referenced as `R.servo_board`.
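Assuming a `Robot` object `R` as created in the example program (and assuming the jaw servos sit on the first two ports), opening and closing the jaws might be sketched as:

```python
from sr.robot import Robot  # import path assumed

R = Robot()

# Servo indices 0 and 1 for the two jaws are assumptions
R.servo_board.servos[0].position = -1  # -1 fully opens a jaw
R.servo_board.servos[1].position = -1

R.sleep(0.5)

R.servo_board.servos[0].position = 1   # close again
R.servo_board.servos[1].position = 1
```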
Your robot has a microswitch and six distance sensors, attached to the digital and analogue pins respectively. These are all attached to a single ruggeduino.
Because these sensors are pre-attached to the ruggeduino, you do not need to set their pin mode before reading from them.
The microswitch is attached to digital pin 2. It is shown as a red coloured block on the robot. Using the `digital_read` method, you'll receive a `bool` telling you whether the switch is currently actuated.
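A minimal sketch of polling the switch (import path assumed, as above):

```python
from sr.robot import Robot  # import path assumed

R = Robot()

# digital_read returns a bool: True while the switch is actuated
if R.ruggeduino.digital_read(2):
    print("The front microswitch is pressed")
```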
Analogous to ultrasound sensors, distance sensors allow you to retrieve the distance between your robot and an object. These are attached to analogue pins A0-A5, and are shown as blue boards with silver transceivers on the robot. The `analogue_read` method will return the distance in metres. The sensors can see in a narrow cone, up to a maximum of about 2m away.
Since these sensors rely on echoes being reflected back from objects, if the angle of incidence between the sensor’s pulse and the contacted surface exceeds 22.5 degrees then the sensor will be unable to detect the object.
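Because readings are plain numbers in metres, obstacle logic can live in an ordinary function fed from `analogue_read`. A sketch (the pin constant `A0` and the exact shape of the read call are assumptions; check the example program):

```python
def obstacle_ahead(reading_m: float, threshold_m: float = 0.4) -> bool:
    """Interpret a distance-sensor reading (in metres).

    The sensors see up to about 2m; treat anything at or inside the
    threshold as an obstacle. A reading of 0 means nothing was detected.
    """
    return 0 < reading_m <= threshold_m

# In robot code this might be fed from the ruggeduino, e.g.:
#   reading = R.ruggeduino.analogue_read(A0)  # call shape assumed
#   if obstacle_ahead(reading):
#       ...  # stop or turn away
```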
The LEDs are attached to digital pins 3-4. Using the `digital_write` method, you can set these to True (On) or False (Off).
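For example, blinking the LED on pin 3 once might be sketched as (import path assumed):

```python
from sr.robot import Robot  # import path assumed

R = Robot()

R.ruggeduino.digital_write(3, True)   # LED on
R.sleep(0.5)
R.ruggeduino.digital_write(3, False)  # LED off
```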
The simulated robot has a camera which provides position and orientation information about other objects within the simulation. This simulates the system of fiducial markers which the physical robot’s camera can detect.
The information returned by the simulated vision API is generally in the same format and units as the physical robot’s vision API.
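A heavily hedged sketch, assuming a `see` method on the camera as on recent kits (the method name and marker attributes are assumptions; consult the vision docs for the real API):

```python
from sr.robot import Robot  # import path assumed

R = Robot()

markers = R.camera.see()  # method name assumed; returns the visible markers
print(f"I can see {len(markers)} markers")

# Each marker carries position and orientation information, per the
# vision docs; the exact attribute names are not shown here.
```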
In the simulated environment, time advances only at the pace that the simulator is run. As a result, using `time.time` to know how long your robot has been running, or `time.sleep` to wait for some duration, will be unreliable. The API present in the simulator therefore supports a slightly different approach to handling time.
`R.time` and `R.sleep` are provided as direct replacements for `time.time` and `time.sleep` respectively, and can be used anywhere the previous functions were used.
Any loop in your code should contain a call to `R.sleep`, even if with a small value. If in doubt, add an `R.sleep`. If you find that the simulator freezes, this indicates that your code has reached a loop which does not contain any `R.sleep` and is expecting time to advance.
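Putting the two replacements together, a well-behaved main loop might look like this sketch (the 10-second figure is arbitrary; import path assumed):

```python
from sr.robot import Robot  # import path assumed

R = Robot()

start = R.time()  # simulated time, not wall-clock time

# Run for 10 simulated seconds
while R.time() - start < 10:
    # ... read sensors, set motor powers ...
    R.sleep(0.05)  # always yield, even briefly, so simulated time can advance
```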