Recently, major technology companies, including Microsoft and Amazon, have rushed to launch their AI products. However, according to a report by CNBC, this rush is taking a significant toll on the engineers building these projects. Many AI engineers at large technology companies have expressed concerns that their superiors are pushing them too hard to get these products launched, prioritizing speed over ethics and safety.
An anonymous Microsoft engineer claimed that the company is competing in an “AI race” and is prioritizing the speed of releasing AI products over concerns about ethics and safety. Microsoft did not comment on the report. The company recently published its first AI accountability and transparency report, in which it said it had established a Responsible AI Council to improve safety in its applications.
Similarly, an anonymous AI engineer from Amazon claimed that they were asked to work on an AI project over a weekend, despite having previously made plans with friends. They worked all weekend to complete the task, but the project was eventually shut down. This engineer also echoed the Microsoft engineer’s comments, saying that Amazon is trying to release new AI products faster rather than paying attention to accuracy and testing.
When contacted, an Amazon spokesperson said that the company is “focused on building and deploying useful, reliable, and secure AI innovations that reinvent and enhance customer experiences.” The spokesperson also pushed back on the anonymous engineer’s account, stating that it is “inaccurate and misleading to use an employee’s anecdote to describe the experience of all Amazon employees working in the field of AI.”
Since OpenAI launched ChatGPT in November 2022, major technology companies have raced to release their own AI services to lead the market. However, companies need to prioritize ethics, safety, accuracy, and testing, alongside speed, when launching AI products.