alessiogambi opened 4 years ago
@masskro0 might be interested (as he created those images in the first place ;) )
Could you post the code that produced these images?
Guess that @masskro0 can do that 👍
Sorry for the late reply. Here is the code that I used:
from os.path import join
from time import time

from beamngpy import BeamNGpy
from beamngpy.sensors import Camera

# Set up the camera (classic BeamNGpy sensor API).
direction = (0, 1, 0)
fov = 90
resolution = (1280, 720)
x, y, z = -0.3, 2.1, 1
camera = Camera((x, y, z), direction, fov, resolution, colour=True, depth=True, annotation=True)
# `ego`, `converter`, `label` and `image_dir` come from the surrounding script.
ego.attach_sensor("camera", camera)

beamng = BeamNGpy('localhost', 64286)
bng = beamng.open()
bng.load_scenario(converter.scenario)
bng.start_scenario()

while True:
    sensors = bng.poll_sensors(ego)
    ego.update_vehicle()
    img = sensors["camera"]["colour"].convert("RGB")
    filename = label + '_{}.png'.format(time())
    file_path = join(image_dir, filename)
    img.save(file_path)
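Since the light only shows up while the vehicle is braking or reversing, one possible workaround (a sketch, not part of BeamNGpy) is to skip saving frames whose electrics reading indicates braking or reverse gear. The key names `brake` and `gear` are assumed to match the dict returned by BeamNGpy's Electrics sensor; adjust them to your version if they differ:

```python
def should_save_frame(electrics, brake_threshold=0.05):
    """Return True when the frame is unlikely to show brake/reverse lights.

    `electrics` is assumed to be the dict produced by polling BeamNGpy's
    Electrics sensor; the 'brake' value is assumed to be a float in [0, 1]
    and 'gear' the current gear as a string.
    """
    braking = electrics.get('brake', 0.0) > brake_threshold
    reversing = electrics.get('gear') == 'R'
    return not (braking or reversing)


# Example: only the first reading passes the filter.
print(should_save_frame({'brake': 0.0, 'gear': 'D'}))  # True
print(should_save_frame({'brake': 0.8, 'gear': 'D'}))  # False
print(should_save_frame({'brake': 0.0, 'gear': 'R'}))  # False
```

Inside the capture loop above, one would call `img.save(file_path)` only when `should_save_frame(sensors['electrics'])` is true. This only reduces the bias at collection time; it does not remove the light sprites themselves.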
I'm going to check whether this problem occurs with the newest BeamNG.research version as well.
@aivora-beamng I experienced the same issue in BeamNG.tech v0.31.3.0 as well. See the image below.
Since we plan to use these vehicles for testing/training vision-based systems, the sprites create a bias in the data.
We noticed that while the ego-car is braking (or reversing), the camera sensor produces images in which a red/white light is visible, biasing AI/DL predictions.
See the attached images for an example; note that I cropped them to fit on this page.
These images have been collected using the following configuration: