Biden to Sign Landmark Executive Order on AI


The move comes amid fears the technology may worsen discrimination, displace workers, and jeopardize national security

U.S. President Joe Biden issued the first executive order on artificial intelligence, requiring new safety assessments, equity and civil rights guidance, and research on AI’s impact on the labor market.

The executive order, which took months to draft, reflects White House concerns that unchecked use of the technology could seriously jeopardize privacy, public health, the economy, and national security.

As part of a practice known as “red teaming,” the directive mandates that businesses developing the most sophisticated AI systems conduct safety testing and report the findings to the government before releasing their products. The directive compels businesses to share red-teaming results with the government by invoking the Defense Production Act, a 1950 statute that has been used in recent emergencies such as the covid-19 pandemic and the infant formula shortage.

According to a draft of the directive seen by The Washington Post, the order uses federal purchasing power to require the government to apply risk management procedures when using AI that might affect people’s rights or safety. Under the draft, agencies would have to regularly monitor and evaluate deployed AI.

Along with directing the government to create guidelines for businesses to watermark AI-generated content, the directive also asks agencies to consider how the technology could affect a range of sectors, including education, healthcare, and the military.

According to Bruce Reed, the deputy chief of staff of the White House, these measures are “the strongest any government in the world has ever taken on AI safety, security, and trust.”

According to a statement from Reed, “it’s the next step in an aggressive strategy to do everything on all fronts to harness the benefits of AI and mitigate the risks.”
