responsible-ai-collaborative / aiid

The AI Incident Database seeks to identify, define, and catalog artificial intelligence incidents.
https://incidentdatabase.ai

Error when submitting incident with dissimilar incident marked #2006

Closed: lmcnulty closed this issue 1 year ago

lmcnulty commented 1 year ago

I tried to submit a report and marked one of the surfaced related incidents as not related. The submission failed with an error that I couldn't resolve until I changed that selection to "not sure" instead.

This may be the same problem as https://github.com/responsible-ai-collaborative/aiid/issues/1969.

Failed request body:

{
  "operationName": "InsertSubmission",
  "query": "mutation InsertSubmission($submission: SubmissionInsertInput!) {\n  insertOneSubmission(data: $submission) {\n    _id\n    __typename\n  }\n}",
  "variables": {
    "submission": {
      "authors": [
        "Jonathan M. Gitlin"
      ],
      "cloudinary_id": "reports/cdn.arstechnica.net/wp-content/uploads/2023/05/FSD-screenshot-760x380.jpg",
      "date_downloaded": "2023-05-16",
      "date_modified": "2023-05-16",
      "date_published": "2023-05-16",
      "date_submitted": "2023-05-16",
      "deployers": {
        "link": []
      },
      "developers": {
        "link": [
          "tesla"
        ]
      },
      "editor_dissimilar_incidents": [
        null
      ],
      "editor_notes": "",
      "editor_similar_incidents": [
        478,
        20,
        4,
        71
      ],
      "embedding": {
        "from_text_hash": "e62bdc897501d3471e8b8f2bbc2fc6b22921a96a",
        "vector": […]
      },
      "harmed_parties": {
        "link": []
      },
      "image_url": "https://cdn.arstechnica.net/wp-content/uploads/2023/05/FSD-screenshot-760x380.jpg",
      "incident_date": "2023-05-14",
      "incident_ids": [],
      "language": "en",
      "nlp_similar_incidents": [
        {
          "incident_id": 20,
          "similarity": 0.9983685612678528
        },
        {
          "incident_id": 71,
          "similarity": 0.998238205909729
        },
        {
          "incident_id": 4,
          "similarity": 0.9981974363327026
        }
      ],
      "plain_text": "Enlarge / This Tesla can clearly detect the pedestrian as they appear on the infotainment display. But the car continues past them, only slowing from 26 mph to 24 mph after it passes the crosswalk. California law requires drivers to come to a complete stop for pedestrians at crosswalks.\n\n\n\nTesla released a new version of its controversial \"Full Self-Driving Beta\" software last month. Among the updates in version 11.4 are new algorithms determining the car's behavior around pedestrians. But alarmingly, a video posted to Twitter over the weekend shows that although the Tesla system can see pedestrians crossing the road, a Tesla can choose not to stop or even slow down as it drives past.\n\nThe video was posted by the Whole Mars Catalog account, a high-profile pro-Tesla account with more than 300,000 followers. The tweet, which has been viewed 1.7 million times, featured a five-second video clip with the accompanying text:\n\nOne of the most bullish / exciting things I've seen on Tesla Full Self-Driving Beta 11.4.1.\n\nIt detected the pedestrian, but rather than slamming on the brakes it just proceeded through like a human would knowing there was enough time to do so.\n\nThe person posting the video then clarified that it was filmed in San Francisco and that anyone not OK with this driving behavior must be unfamiliar with city life. (As someone who has lived in big cities all his life, I am definitely not OK with cars not stopping for pedestrians at a crosswalk.)\n\nMost partially automated driving systems like General Motors' Super Cruise or Ford's BlueCruise are geofenced to a controlled operational domain—usually restricted-access divided-lane highways. Tesla has taken a different approach, though, and allows users to unleash its FSD beta software on surface streets.\n\nNot everyone is as comfortable with Tesla drivers road-testing unfinished software around other road users. In February, the National Highway Traffic Safety Administration told Tesla to issue a recall for nearly 363,000 vehicles with the software installed.\n\nThe agency had four principal complaints, including that the \"FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.\"\n\nThe version 11.4 update in April was supposed to improve how the cars behaved, but there's now more evidence that the FSD Beta still leads to Teslas breaking traffic laws. Section 7 of California's Driver's Handbook, which deals with laws and rules of the road, says that pedestrians are considered vulnerable road users and that \"pedestrians have the right-of-way in marked or unmarked crosswalks. If there is a limit line before the crosswalk, stop at the limit line and allow pedestrians to cross the street.\"\n\nThis is not the first time Tesla's software has been programmed to break traffic laws, either.\n\nFSD is “make or break” for Tesla\n\nTesla CEO Elon Musk has repeatedly talked about the importance of FSD to his company, saying that it is \"make or break\" for Tesla and that it's the difference between Tesla being \"worth a lot of money or worth basically zero.\"\n\nFSD Beta has been implicated in a number of crashes and is the subject of several of the open federal investigations into Tesla's electric vehicles. 
The option now costs $15,000, and each time the automaker declares another feature \"complete,\" it allows the company to recognize some of the deferred revenue it has been collecting as payments for the software.\n\nDespite that bold stance in public, Tesla has been far more circumspect when dealing with authorities—in 2020, it told the California Department of Motor Vehicles that it did not expect FSD to become significantly more capable and that it would never pass beyond so-called SAE level 2, which requires an alert human in the driver's seat who remains liable for the car's actions.\n\nOr, as author Ed Niedermeyer more concisely put it, \"Full Self-Driving\" is not, and never will be, actually self-driving.\"\n\nTesla is holding its annual shareholder meeting later today in Texas.\n",
      "source_domain": "arstechnica.com",
      "submitters": [
        "Luna McNulty"
      ],
      "tags": [],
      "text": "[Enlarge](https://cdn.arstechnica.net/wp-content/uploads/2023/05/FSD-screenshot-scaled.jpg) / This Tesla can clearly detect the pedestrian as they appear on the infotainment display. But the car continues past them, only slowing from 26 mph to 24 mph after it passes the crosswalk. California law requires drivers to come to a complete stop for pedestrians at crosswalks.\n\n---\n\nTesla released a new version of its controversial \"Full Self-Driving Beta\" software last month. [Among the updates in version 11.4](https://www.notateslaapp.com/software-updates/version/2023.6.15/release-notes) are new algorithms determining the car's behavior around pedestrians. But alarmingly, a video posted to Twitter over the weekend shows that although the Tesla system can see pedestrians crossing the road, a Tesla can choose not to stop or even slow down as it drives past.\n\nThe video was posted by the Whole Mars Catalog account, a high-profile pro-Tesla account with more than 300,000 followers. [The tweet](https://twitter.com/WholeMarsBlog/status/1657807019703943169?s=20), which has been viewed 1.7 million times, featured a five-second video clip with the accompanying text:\n\n> One of the most bullish / exciting things I've seen on Tesla Full Self-Driving Beta 11.4.1.\n> \n> It detected the pedestrian, but rather than slamming on the brakes it just proceeded through like a human would knowing there was enough time to do so.\n\nThe person posting the video then clarified that it was filmed in San Francisco and that anyone not OK with this driving behavior [must be unfamiliar with city life](https://twitter.com/WholeMarsBlog/status/1657940431026667520?s=20). (As someone who has lived in big cities all his life, I am definitely not OK with cars not stopping for pedestrians at a crosswalk.)\n\nMost partially automated driving systems like General Motors' Super Cruise or Ford's BlueCruise are geofenced to a controlled operational domain—usually restricted-access divided-lane highways. Tesla has taken a different approach, though, and allows users to unleash its FSD beta software on surface streets.\n\nNot everyone is as comfortable with Tesla drivers road-testing unfinished software around other road users. In February, [the National Highway Traffic Safety Administration told Tesla to issue a recall](https://arstechnica.com/cars/2023/02/tesla-to-recall-362758-cars-because-full-self-driving-beta-is-dangerous/) for nearly 363,000 vehicles with the software installed.\n\nThe agency had four principal complaints, including that the \"FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.\"\n\nThe version 11.4 update in April was supposed to improve how the cars behaved, but there's now more evidence that the FSD Beta still leads to Teslas breaking traffic laws. [Section 7 of California's Driver's Handbook](https://www.dmv.ca.gov/portal/handbook/california-driver-handbook/laws-and-rules-of-the-road/#:~:text=Pedestrians%20have%20the%20right%2Dof,and%20be%20prepared%20to%20stop.), which deals with laws and rules of the road, says that pedestrians are considered vulnerable road users and that \"pedestrians have the right-of-way in marked or unmarked crosswalks. 
If there is a limit line before the crosswalk, stop at the limit line and allow pedestrians to cross the street.\"\n\nThis is not the first time Tesla's software [has been programmed to break traffic laws, either](https://jalopnik.com/teslas-fsd-betas-driving-modes-bring-up-interesting-eth-1848331683).\n\nFSD is “make or break” for Tesla\n--------------------------------\n\nTesla CEO Elon Musk has repeatedly talked about the importance of FSD to his company, saying that it is \"make or break\" for Tesla and that it's the difference between Tesla being \"worth a lot of money or worth basically zero.\"\n\nFSD Beta has been [implicated](https://electrek.co/2021/11/12/tesla-owner-claims-first-full-self-driving-beta-crash-strange-nthsa-complaint/) in a [number](https://electrek.co/2022/11/16/tesla-reports-two-fatal-crash-autopilot-fsd-beta/) of [crashes](https://arstechnica.com/cars/2022/12/eight-car-thanksgiving-pileup-blamed-on-tesla-full-self-driving-software/) and is the subject of several of the open federal investigations into Tesla's electric vehicles. The option now costs $15,000, and each time the automaker declares another feature \"complete,\" [it allows the company to recognize some of the deferred revenue](https://arstechnica.com/cars/2023/04/with-tesla-profits-down-musk-dangles-cybertruck-fsd-this-year/) it has been collecting as payments for the software.\n\nDespite that bold stance in public, Tesla has been far more circumspect when dealing with authorities—in 2020, it told the California Department of Motor Vehicles [that it did not expect FSD to become significantly more capable](https://arstechnica.com/cars/2021/03/tesla-full-self-driving-beta-isnt-designed-for-full-self-driving/) and that it would never pass beyond so-called SAE level 2, which requires an alert human in the driver's seat who remains liable for the car's actions.\n\nOr, [as author Ed Niedermeyer more concisely put it](https://twitter.com/Tweetermeyer/status/1368678998679560193), \"Full Self-Driving\" is not, and never will be, actually self-driving.\"\n\nTesla [is holding its annual shareholder meeting](https://www.tesla.com/2023shareholdermeeting) later today in Texas.",
      "title": "Tesla’s “Full Self-Driving” sees pedestrian, chooses not to slow down",
      "url": "https://arstechnica.com/cars/2023/05/teslas-full-self-driving-sees-pedestrian-chooses-not-to-slow-down/",
      "user": {
        "link": "63601cdc29e6840df23ad3e5"
      }
    }
  }
}

Response:

{
  "data": {
    "insertOneSubmission": null
  },
  "errors": [
    {
      "message": "reason=\"role \\\"role is admin\\\" in \\\"aiidprod.submissions\\\" does not have insert permission for document with _id: ObjectID(\\\"6463e2e6ab0d491cddddd15b\\\"): could not validate document: \\n\\teditor_dissimilar_incidents.0: Invalid type. Expected: undefined, given: null\"; code=\"SchemaValidationFailedWrite\"; untrusted=\"insert not permitted\"; details=map[]",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "insertOneSubmission"
      ]
    }
  ]
}
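
Based on the validation message, the insert is rejected because `editor_dissimilar_incidents` contains a null element. A minimal client-side sketch of a workaround, assuming the submission variables are assembled in the frontend before the mutation is sent (the helper name and draft shape below are hypothetical, not the actual form code):

```typescript
// Hypothetical sketch: strip null elements from the incident-ID arrays so the
// schema validation ("Expected: undefined, given: null") no longer rejects
// the insertOneSubmission mutation.
interface SubmissionDraft {
  editor_similar_incidents: (number | null)[];
  editor_dissimilar_incidents: (number | null)[];
  [key: string]: unknown;
}

function sanitizeSubmission(draft: SubmissionDraft): SubmissionDraft {
  // Keep only concrete incident IDs; a "Not related" choice that never
  // resolved to a numeric ID would otherwise leave a null behind.
  const keepIds = (ids: (number | null)[]) =>
    ids.filter((id): id is number => id !== null);

  return {
    ...draft,
    editor_similar_incidents: keepIds(draft.editor_similar_incidents),
    editor_dissimilar_incidents: keepIds(draft.editor_dissimilar_incidents),
  };
}
```

Applied to the failed request above, this would turn `"editor_dissimilar_incidents": [null]` into `[]`, which the schema accepts.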
pdcp1 commented 1 year ago

@lmcnulty Could you add some steps to reproduce it? I assume this is happening on Production, right?

pdcp1 commented 1 year ago

According to your example and the error message, the problem is this null value:

"editor_dissimilar_incidents": [
  null
],

But I couldn't reproduce it on Production or Staging.
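
For what it's worth, one hedged guess at how a null element could end up there (illustrative only; the actual related-incidents component may work differently): if a candidate is marked "Not related" while its incident ID is still missing, mapping the selections to IDs would produce a null, as in the failed request.

```typescript
// Hypothetical illustration: a candidate flagged "Not related" whose
// incident_id is undefined maps to null in the resulting ID array,
// which matches what the failed request contained.
type Candidate = { incident_id?: number; dissimilar?: boolean };

function collectDissimilarIds(candidates: Candidate[]): (number | null)[] {
  return candidates
    .filter((c) => c.dissimilar)
    .map((c) => c.incident_id ?? null);
}

// collectDissimilarIds([{ dissimilar: true }]) -> [null]
```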

pdcp1 commented 1 year ago

@lmcnulty I couldn't reproduce this. Can you provide the steps to reproduce it? If it isn't reproducible, feel free to close this issue. Thanks!

smcgregor commented 1 year ago

Ping @lmcnulty

pdcp1 commented 1 year ago

I will close this issue since we cannot reproduce it and the submission form has changed dramatically since then.