
States Want to Regulate AI. Why the US Congress May Push Back

Under a proposal being considered in the US House of Representatives, states wouldn't be able to enforce their own rules on artificial intelligence technology for 10 years.

No state or political subdivision "may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems" for 10 years, according to the proposal, which is set to be considered by the House Energy and Commerce Committee on Tuesday.

Before it could become law, the plan would need the approval of both chambers of Congress and President Donald Trump.


AI developers and some lawmakers argue that federal action is needed to keep states from creating a patchwork of differing rules and requirements across the country, which they say would stifle the technology's growth. Since ChatGPT exploded onto the scene at the end of 2022, the rapid rise of generative AI has pushed companies to embed the systems in as many areas as possible. The US and China are competing to see whose technology will dominate, but generative AI poses safety, transparency and other consumer risks that lawmakers have sought to mitigate.

During an April congressional hearing, Alexandr Wang, founder and CEO of the data company Scale AI, told lawmakers that "we need, as an industry and as a country, one clear federal standard, whatever it may be. But we need one, we need clarity on one federal standard, and we need preemption to prevent a situation where there are 50 different standards."

The effort to restrict states' ability to regulate AI would apply to a technology that is increasingly ingrained in every aspect of American life.

"There's been a lot of activity at the state level, and I do think it's important for us to look at this issue at multiple levels," said Anjana Susarla, a professor at Michigan State University who studies AI. "We can look at it at the national level. We can also look at it at the state level. I think we need both."

States are already regulating AI

The proposed language would bar states from enforcing any regulation of AI, including laws already on the books. There are exceptions, including laws and regulations that make it easier to develop AI, and those that apply the same standards to non-AI models and systems that perform comparable tasks.

These kinds of laws are just starting to appear. The most significant action so far has come not in the US but in Europe, where the European Union has already put requirements on AI in place. But states are starting to get involved.

Colorado passed a set of consumer protections last year that will take effect in 2026. California adopted more than a dozen AI-related laws last year. Other states have laws and regulations that often address specific issues such as deepfakes.

According to the National Conference of State Legislatures, state lawmakers have introduced at least 550 AI-related proposals so far in 2025.

At the April House committee hearing, Rep. Jay Obernolte, a Republican from California, signaled that he wanted to get ahead of more state-level regulation. "We have a limited amount of legislative runway to be able to get that problem solved before the states get too far ahead," he said.

What would a moratorium on state AI regulation mean?

AI developers have asked for any guardrails placed on their work to be consistent and streamlined. At a hearing held last week by the Senate Committee on Commerce, Science, and Transportation, OpenAI CEO Sam Altman told Sen. Ted Cruz, a Republican from Texas, that an EU-style regulatory system "would be disastrous" for the industry. Altman suggested instead that the industry develop its own standards.

Asked by Sen. Brian Schatz, a Democrat from Hawaii, whether industry self-regulation is enough for now, Altman said he thought some guardrails would be good, but "it's easy for it to go too far. As I have learned more about how the world works, I am more afraid that it could go too far and have really bad consequences."

(Disclosure: CNET's parent company, Ziff Davis, sued OpenAI in April, alleging that it violated Ziff Davis' copyrights when it trained and ran its AI systems.)

Consumer advocates say more regulation is needed, and that hampering states' ability to enact it could hurt the privacy and safety of users.

"AI is being used far too often to make decisions about people's lives without transparency, accountability or recourse. It's also facilitating chilling fraud, impersonation and surveillance," Ben Winters, director of AI and privacy at the Consumer Federation of America, said in a statement. "A 10-year pause would lead to more discrimination, more fraud and less control. Simply put, it sides with tech companies over the people they impact."

Susarla said the pervasiveness of AI across industries means states might be able to regulate issues like privacy and transparency more broadly, without focusing on the technology itself. But a moratorium on AI regulation could see those rules get tied up in legal challenges.

"It has to be some kind of balance between 'we don't want to stop innovation' and the fact that there can be real consequences," she said.
