Implementing your own robot

At the moment only one robot - Khepera3 - is implemented in pySimiam. Most likely, you will need to customize it for your needs or implement a completely new robot.

From the point of view of the simulator, a robot is a drawable object that accepts control inputs from a supervisor. The robot and its supervisor form a tightly bound pair, and the simulator only plays the role of a messenger between them.

The robot is a SimObject

Start by subclassing robot.SimBot (the base class robot.Robot is not intended to be subclassed directly; see the API below) and implementing the drawing. Two methods have to be implemented: draw(), which draws the robot on screen, and get_envelope(), which returns the bounding polygon of the robot for collision detection.

The get_envelope() method should return a list of points describing the polygon. By default this polygon is only used for collision detection, and so it does not have to correspond exactly to the drawing of the robot.
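
For example, for a simple rectangular robot the envelope could be defined as follows (the dimensions here are made-up placeholders, not taken from any existing robot):

def get_envelope(self):
    """Return the bounding polygon of the robot in its local frame:
    a 20 x 12 cm rectangle centered on the origin."""
    length, width = 0.20, 0.12
    return [( length/2,  width/2),
            (-length/2,  width/2),
            (-length/2, -width/2),
            ( length/2, -width/2)]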

The draw() method accepts a Renderer object as a parameter and can use any of its functions to draw the robot. The drawing can be as complicated as you wish, or as simple as:

def draw(self,r):
    """Draw the envelope (shape) filling it with the internal color."""
    r.set_pose(self.get_pose())
    r.set_brush(self.get_color())
    r.draw_polygon(self.get_envelope())

Note the use of the set_pose() method. It is much easier to draw the robot in the robot's internal reference frame, so it is recommended to set the pose on the renderer before drawing. It is also recommended to use the color of the robot, obtained with get_color(), during drawing, to be able to distinguish different robots.

If your robot has sensors that can be drawn, an additional method draw_sensors() can be implemented. We will revisit this point later, when we talk about sensors.

The robot moves

To move the robot, the simulator calls move() with the time interval as a parameter. When implementing this method, you should update the pose of the robot with set_pose(), depending on its internal state.
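
For example, a differential-drive robot could integrate the unicycle kinematics over dt. The following is only a sketch: it assumes that the wheel angular speeds are stored in self.ang_velocity (a (left, right) tuple, set by set_inputs() as described below) and that the wheel radius and the distance between the wheels are stored in self.wheel_radius and self.wheel_base_length; none of these attribute names are prescribed by the simulator.

from math import cos, sin

from pose import Pose

def move(self, dt):
    """Integrate the differential-drive kinematics over the interval dt."""
    vl, vr = self.ang_velocity            # wheel angular speeds (rad/s)
    v = self.wheel_radius*(vl + vr)/2     # linear speed of the robot
    w = self.wheel_radius*(vr - vl)/self.wheel_base_length  # angular speed

    x, y, theta = self.get_pose()
    # A simple Euler step; a real robot model may integrate more accurately.
    self.set_pose(Pose(x + v*cos(theta)*dt,
                       y + v*sin(theta)*dt,
                       theta + w*dt))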

The internal state of the robot is set by the supervisor, which passes parameters into set_inputs(). The format of the parameters is up to the robot implementation, and the supervisor has to conform to it. Please provide sufficient documentation when implementing this method.
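
For a differential-drive robot, a natural choice is a pair of wheel angular velocities. A minimal sketch, using the same hypothetical attribute name as in the move() sketch above:

def set_inputs(self, inputs):
    """Set the wheel angular velocities.

    inputs is expected to be a (vl, vr) tuple in rad/s.
    """
    self.ang_velocity = inputs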

Information about the robot

Some of the robot’s parameters are important for understanding its state. For example, in the case of a differential-drive robot, the radius of the wheels and the distance between them are important. There should also be a way to obtain sensor readings. This information is expected to be provided by the get_info() method. As with set_inputs(), the actual structure of the returned object is up to the robot and only matters for the supervisor. You can turn to the Khepera3 code for inspiration.
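
A minimal sketch of get_info() for such a differential-drive robot could look like this; the dictionary layout is purely illustrative (the Khepera3 implementation uses its own structure), and the attribute names are the same assumptions as in the sketches above:

def get_info(self):
    """Return the robot parameters and sensor readings for the supervisor."""
    # The layout of this structure is up to you; document it well,
    # since the supervisor has to rely on it.
    return {'wheels': {'radius': self.wheel_radius,
                       'base_length': self.wheel_base_length},
            'ir_sensors': [s.reading() for s in self.ir_sensors]}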

One thing that is important for the simulator, though, is the set of external sensors of the robot, as the sensors have to interact with the world. This information should be returned as a list of sensors from the get_external_sensors() method.

For example, if your robot has an IR sensor skirt with five sensors, the list should contain these sensor objects (see sensor.ProximitySensor). You should also implement draw_sensors() in this case, which can be as simple as:

def draw_sensors(self,renderer):
    """Draw the sensors that this robot has"""
    for sensor in self.ir_sensors:
        sensor.draw(renderer)
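
In the same spirit, get_external_sensors() can simply return that list (assuming, as in the drawing code above, that the sensors are stored in self.ir_sensors):

def get_external_sensors(self):
    """Return the list of proximity sensors for the simulator to update."""
    return self.ir_sensors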

Note

At the moment, the only sensors supported by the simulator are proximity sensors, such as ultrasound and IR sensors. Please contact the developers (or extend the simulator yourself) if you need another kind of external sensor.
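
To illustrate where such a sensor skirt could come from, here is a hedged sketch of a constructor for a hypothetical robot class. The geometry tuple (rmin, rmax, angle) follows the sensor.ProximitySensor documentation below; all numeric values and the class name are placeholders, and in practice you will probably subclass ProximitySensor to provide a distance_to_value() suited to your hardware.

from math import pi, cos, sin, radians

from pose import Pose
from robot import SimBot
from sensor import ProximitySensor

class MyRobot(SimBot):
    def __init__(self, pose, color=0):
        SimBot.__init__(self, pose, color)
        # Five IR sensors on a 5 cm circle at the front of the robot;
        # the angles and distances are illustrative only.
        self.ir_sensors = []
        for angle in (-pi/2, -pi/4, 0.0, pi/4, pi/2):
            sensor_pose = Pose(0.05*cos(angle), 0.05*sin(angle), angle)
            self.ir_sensors.append(
                ProximitySensor(sensor_pose, self, (0.02, 0.2, radians(20))))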

Testing

Your robot should be ready now. How can you test it? You need two parts: a world and a supervisor. In the beginning, neither of the two has to be very complicated. A simple world can contain just one robot, and your supervisor can work without any controllers. This should be enough to test the drawing, the positioning of the sensors and the dynamics.

API

class pose.Pose(*args, **kwargs)[source]

The Pose class describes the position and orientation of objects in 2D space. It uses a right-hand coordinate system, with theta measured counter-clockwise from the x-axis.

There are several ways to create a pose:

Pose(x,y,theta) A pose at x,y and orientation theta
Pose(x,y) Same as Pose(x,y,0)
Pose() Same as Pose(0,0,0)
Pose([x,y,theta]) Same as Pose(x,y,theta)

There are several ways to access pose parameters:

x, y, theta = pose
x, y, theta = pose.get_list()
x = pose.x; y = pose.y; theta = pose.theta
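
For example:

from math import pi

from pose import Pose

p = Pose(1.0, 2.0, pi/2)    # at (1, 2), facing along the positive y-axis
x, y, theta = p             # unpack the pose parameters
p.set_pose(theta=0.0)       # turn to face along the x-axis; x and y are kept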

Pose.get_list()[source]

Get the pose as a list [x, y, theta]. Equivalent to list(pose).

Pose.get_transformation()[source]

Get the 3x3 transformation matrix associated with the pose.

Pose.iscloseto(other, epsilon)[source]

Compare this pose to other. Returns True if the relative distance in x, y and theta is smaller than epsilon

Pose.set_pose(*args, **kwargs)[source]

Set all or some pose parameters.

Possible arguments are:

set_pose(x, y, theta) Set all of x, y and theta
set_pose(another_pose) Use x, y and theta from another pose
set_pose(x = 3.0) Only change the x position
set_pose(theta = pi, y = 3.0) Only change the y position and orientation
set_pose(another_pose, y = 1) Use x and theta from another pose, use y=1

class simobject.SimObject(pose, color=0)[source]

The base class for all objects that can be drawn in the simulator. Every SimObject has a pose, an envelope and a color.

Parameters:
  • pose (Pose) – The position of the object.
  • color (int) – The internal color of the object (0xAARRGGBB or 0xRRGGBB). The default color is black.

SimObject.get_color()[source]

Get the internal color of the object

SimObject.set_color(color)[source]

Set the internal color of the object

SimObject.get_pose()[source]

Get the pose of the object in world coordinates

SimObject.set_pose(pose)[source]

Set the pose of the object in world coordinates

SimObject.draw(renderer)[source]

Draws the object using renderer (see Renderer).

The object doesn’t have to use only one color. It doesn’t even have to use its internal color while drawing.

SimObject.get_envelope()[source]

Get the envelope of the object in object’s local coordinates.

The envelope is a list of xy pairs, describing the shape of the bounding polygon.

SimObject.get_world_envelope(recalculate=False)[source]

Get the envelope of the object in world coordinates. Used for checking collision.

The envelope is cached, and will be recalculated if recalculate is True.

SimObject.get_bounding_rect()[source]

Get the smallest rectangle that contains the object as a tuple (x, y, width, height).

SimObject.has_collision(other)[source]

Check if the object has collided with other. Return True or False

SimObject.get_contact_points(other)[source]

Get a list of contact points with other object. Returns a list of (x, y)

SimObject.get_bounds()[source]

Get the smallest rectangle that contains the object as a tuple (xmin, ymin, xmax, ymax)

class simobject.Polygon(pose, shape, color)[source]

The polygon is a simobject that gets the envelope supplied at construction. It draws itself as a filled polygon.

Parameters:
  • pose (Pose) – The position of the polygon.
  • shape (list((int,int))) – The list of points making up the polygon.
  • color (int) – The color of the polygon (0xAARRGGBB or 0xRRGGBB).

Polygon.draw(r)[source]

Draw the envelope (shape) filling it with the internal color.

class simobject.Cloud(color)

The cloud is a collection of points.

Cloud.add_point(pose)

Append a point at pose to the collection. The orientation of the pose is ignored

Cloud.draw(r)

Draw a polyline with nodes at all added points, using the internal color

class simobject.Path(start, color)[source]

The path is a simobject that draws itself as a polyline. The line starts at start, and can be continued by adding points using add_point().

Parameters:
  • start (Pose) – The starting point of the polyline in world coordinates.
  • color (int) – The color of the line (0xAARRGGBB or 0xRRGGBB).

The path is used by the simulator to track the history of robot motion

Path.reset(start)[source]

Set the start point to start.x and start.y and remove all other points

Path.draw(r)[source]

Draw a polyline with nodes at all added points, using the internal color

class robot.RealBot(pose, color=0)

This type of robot implements communication with a real-world robot.

Although this is a SimObject, it doesn’t move by itself. Use set_pose() to move the robot.

RealBot.pause()

Stops the robot, saving the state

RealBot.resume()

Restarts the robot from the saved state

RealBot.update_external_info()

Initiate communication with the real robot and get state info back.

class robot.Robot(pose, color=0)[source]

The robot is a SimObject that implements drawing and information functions to interface with the supervisor.

This class is not intended to be subclassed in user code. Use one of the provided subclasses instead: SimBot for emulated robots or RealBot for physical robots.

Robot.draw_sensors(renderer)[source]

Draw the sensors that this robot has

Robot.get_info()[source]

Return the robot information structure, including sensor readings and shape information

Robot.set_inputs(inputs)[source]

Set drive inputs in the format needed by this robot

class robot.SimBot(pose, color=0)

The robot defined by this class is a simulated robot, and implements its own motion in move().

To implement a new type of robot, subclass SimBot and implement get_info() and get_external_sensors().

To make your robot move, implement move().

To make your robot controllable, implement set_inputs().

If your robot has sensors that can be drawn in the view, implement draw_sensors().

SimBot.get_external_sensors()

Get the external sensors of the robot as a list. This function is used to update the sensor readings in proximity sensors.

SimBot.move(dt)

Move the robot for a time interval dt.

class sensor.Sensor[source]

Base class for sensor objects

classmethod Sensor.add_gauss_noise(value, sigma)[source]

Returns the value with an added normal noise

The return value is normally distributed around value with a standard deviation sigma

class sensor.MountedSensor(pose, parent)[source]

A sensor that moves together with its parent object.

The sensor is assumed to be attached to parent at pose in local coordinates.

MountedSensor.get_internal_pose()[source]

Get the pose of the sensor in the parent (robot) coordinates.

class sensor.ProximitySensor(pose, robot, geometry)[source]

Create a proximity sensor mounted on robot at pose. The geometry is a (rmin, rmax, angle) tuple.

ProximitySensor.get_envelope()[source]

Return the envelope of the sensor

ProximitySensor.distance_to_value(dst)[source]

Convert a distance dst to a sensor reading value.

ProximitySensor.distance()[source]

Return the currently measured distance.

ProximitySensor.reading()[source]

Returns the reading value

ProximitySensor.update_distance(sim_object=None)[source]

Update the measured distance, taking sim_object into account if it is given.

ProximitySensor.draw(r)[source]

Draw the sensor as a SimObject.

ProximitySensor.get_distance_to(sim_object)[source]

Get the distance to another SimObject. Returns the distance in meters, or None if not in contact.
