Simulated robot arms are non-solid bodies

Hi,

I noticed that in the simulation the arms (but not the arm tips) of the robot are not actually solid bodies - i.e. they are able to pass through each other and the cube is able to pass through the arms. Unfortunately the policy I have learnt in simulation exploits this inaccuracy and so is not transferring very well to the real robot - the arms collide with the cube and one another when the policy does not expect them to.

I’m just posting this on the off chance there is a quick/easy fix to this issue - i.e. are you aware of any way to make the arms behave as if they are solid bodies? No worries if not, thanks.

Can you post a screenshot or maybe video showing the erroneous behaviour? This would help to better understand the circumstances when this happens.

In general the cube and all parts of the robot should be collision objects (I did a quick test to verify this). However, there is a known issue that when dragging the fingers around with the mouse cursor, it is possible to get them stuck inside the arena boundary. I don’t think I ever saw that happening when the robot was moving on its own (i.e. by sending commands to the motors instead of dragging with the mouse), but maybe it can happen under certain circumstances.
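For reference, a quick way to check this kind of thing is to ask PyBullet which links actually have a collision shape attached. The sketch below is generic PyBullet (not trifinger_simulation code); `robot_id` is just a placeholder for the body id of the loaded robot.

```python
import pybullet as p

def links_without_collision_shape(robot_id):
    """Return indices of links that have no collision shape attached."""
    missing = []
    # -1 is the base link, 0..n-1 are the child links of the joints
    for link_index in range(-1, p.getNumJoints(robot_id)):
        if len(p.getCollisionShapeData(robot_id, link_index)) == 0:
            missing.append(link_index)
    return missing
```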

Unfortunately, we don’t have a fix for this so far :frowning:. I’m also not sure if this is a problem with our models or a more general problem of PyBullet.

This video shows the issues (let me know if access to link isn’t working).

In this extreme case the policy doesn’t have much use for the third arm, so it often leaves it in the middle and lifts the cube straight through it! However, there are also more subtle cases where collisions do not occur when they should, which hurts transfer to the real robot.

On another note, you can see in the video that when the tip is in contact with the cube it is actually hovering slightly above the cube surface. I think this means the simulated cube is effectively slightly larger than the real cube (i.e. has a larger width). I think this is also hurting our policy transfer. To rectify this, or to change the cube dimensions for domain randomization purposes, would I need to apply changes to this cube_v2.obj file?

A good way to overcome these issues would probably be to train with data from the real robot, but unfortunately we have yet to complete a system which can do so and are pretty short on time.

Thanks for the video, this really doesn’t look good :confused:. I’ll look into it.

I just remembered something: I think for the robot links, PyBullet does not use the exact shape but the convex hull for collision detection. Since the links have some small concavities, this can be noticeable in some situations (i.e. the simulation detecting a collision a bit before they actually touch). However, this does not explain the intersection of objects that we have here.

I am not sure what is causing the visible gap between cube and finger tip in the video. The cube should have the same size in simulation and reality (with maybe some inaccuracy < 1mm). However, the tips of the real robot are soft, which is not modeled in the simulation. This means the real robot can push a bit into the object.
In any case, to change the object size in simulation you can indeed modify cube_v2.obj, or alternatively you should be able to rescale it in the corresponding cube_v2.urdf (example).
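For a quick experiment it should also be possible to do the rescaling at load time instead of editing the files, using PyBullet’s `globalScaling` argument of `loadURDF`. This is just a generic sketch (the path, position and scale factor are placeholders), not how trifinger_simulation loads the cube internally:

```python
import pybullet as p

p.connect(p.DIRECT)

# Placeholder path to the cube model; globalScaling rescales the whole URDF.
cube_id = p.loadURDF(
    "cube_v2.urdf",
    basePosition=[0.0, 0.0, 0.0325],
    globalScaling=0.97,  # e.g. shrink by 3 % for domain randomization
)
```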

Ok, thank you.

Is there a way to change the cube_v2.urdf file contained within the singularity image or do you recommend I pull the trifinger_simulation code and change it manually?

You can convert the image to a sandbox (this basically extracts the sif-file into a directory structure where you can edit files before running the image).

Alternatively you could add a copy of trifinger_simulation to your repository and modify it there.

Thanks, I’ll try adding a copy of trifinger_simulation (presumably the ‘real_robot_challenge_2021’ branch is best) and modifying it.

Regarding the issue in the video with the lack of collisions - is there a chance that something I have changed on my end is causing the problem, or have you been able to recreate it also?

So far I only tested by dragging the fingers around with the mouse but that’s maybe not the best way of testing this. Do you have a log file of the run from the video (or a similar one) that you could share? Assuming you did not change any properties of the simulation, I should be able to replicate the behaviour by replaying the actions from the log.

So far I couldn’t find anything obviously wrong with our setup. We are using the meshes for collision detection; maybe they are somehow in a format that is non-optimal for PyBullet. Maybe using a more simplified mesh or a set of primitive shapes like spheres, cylinders, etc. would help. I have not yet had the time to properly test this, though.
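To illustrate what I mean by primitive shapes: something along these lines (generic PyBullet, with made-up dimensions rather than the real finger geometry) would use a capsule instead of a mesh for collision detection:

```python
import pybullet as p

p.connect(p.DIRECT)

# Capsule roughly standing in for one arm link (radius/height are made up).
link_collision = p.createCollisionShape(p.GEOM_CAPSULE, radius=0.015, height=0.16)
link_visual = p.createVisualShape(p.GEOM_CAPSULE, radius=0.015, length=0.16)

body_id = p.createMultiBody(
    baseMass=0.1,
    baseCollisionShapeIndex=link_collision,
    baseVisualShapeIndex=link_visual,
    basePosition=[0.0, 0.0, 0.2],
)
```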

I read in PyBullet’s documentation that “an object should not travel faster than its own radius within one timestep”, otherwise collision detection may have problems, but it does not seem like this is the case in the video you posted.
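As a rough sanity check of that rule of thumb (the numbers are assumptions: a cube half-width of about 0.0325 m and a simulation time step of 0.001 s):

```python
half_width = 0.0325  # m, approximate "radius" of the cube (assumption)
timestep = 0.001     # s, simulation step size (assumption)

max_safe_speed = half_width / timestep
print(f"tunnelling becomes plausible above ~{max_safe_speed:.1f} m/s")
# -> ~32.5 m/s, far above anything the fingers or cube reach in the video
```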

I was in fact changing some of the properties of the simulation (for domain randomization purposes) and it turns out this is somehow causing the issue. I tested the model in the non-domain-randomized environment and it was unable to pass the cube through the robot arms etc.

So no need to worry, it seems your setup is fine. I’ll let you know if I figure out specifically what is causing the issue.
