Smart algorithms can discriminate "inadvertently"


Companies will have to evaluate whether the algorithms behind their tools distort results or discriminate. Sophisticated analytical technologies used by banks and other business decision-makers can unintentionally discriminate against some users, and big-data analysis in many sectors can end up disadvantaging certain groups.


The issue has flared up again after US regulators warned that such distortions are very likely in the banking and insurance sectors, and after the largest social network came under criticism for algorithms that show certain ads only to specific users.

Financial "big data"

Complex analytical technologies that banks and insurers use to make business decisions can unintentionally discriminate against some consumers.


That warning came from Linda Lacewell, the new head of New York's financial services regulator (NYDFS), Reuters writes. "Big data is a risk and an opportunity," Lacewell said at an event hosted by a law firm. "We have to make sure it does not harm the end user of the product." Lacewell's comments come at a time when more and more insurance companies and banks rely on smart technology and automation to help them make decisions ranging from underwriting to bonuses to creditworthiness.

"We are doing a lot of work on the technology industry, and that will be a focus for us," Lacewell said. Insurers will have to evaluate the impact of the algorithms they use and respond to it. "I do not believe the person who creates such an algorithm deliberately tries to discriminate against someone and exclude a group, but what effect does the algorithm have?" Lacewell asked rhetorically.

The subject continues an earlier discussion of the algorithms financial institutions use to evaluate their customers. That discussion began at the start of the year, when the regulator was still led by Maria Vullo, Lacewell's predecessor. Under her leadership, the agency issued guidance allowing New York insurers to use social media and other non-traditional data sources when setting premiums for their customers.

However, the regulator also stressed that the handling of such sources and data analysis must not "discriminate against certain clients."

Job ads

Manipulation is also possible on social media, which can again lead to discrimination, as research involving Facebook has revealed. A new academic study has shown that Facebook's algorithms can channel certain job ads in a discriminatory way, even when advertisers apply no stereotypes and make their advertising equally visible to all potentially interested parties.

A group of researchers from Northeastern University, the University of Southern California and the civil-rights group Upturn ran ads seeking lumber-industry workers and preschool teachers. The ads were set up to reach both genders equally. Nevertheless, Facebook delivered the first ad mostly to men and the teaching ad mostly to women.

Facebook offers advertisers extensive options for choosing who will see their ads, ensuring, for example, that diaper ads go primarily to young parents and that fundraising appeals for political candidates are seen primarily by their supporters. But even when advertisers choose to show their messages evenly, Facebook automatically decides who actually sees them, based in part on the "knowledge" of its algorithms, which predict which users are most likely to click.
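How click-optimized delivery can skew an audience can be seen in a minimal sketch. Everything here is invented for illustration (the audience, the click model, the 100-slot budget); it is not Facebook's actual system, only a toy version of the mechanism the study describes: if a learned click model scores one group slightly higher, ranking users purely by predicted clicks concentrates the ad on that group even though targeting was neutral.

```python
import random

random.seed(0)

# A simulated audience: half men, half women, all equally eligible.
audience = [{"gender": "M"} for _ in range(500)] + \
           [{"gender": "F"} for _ in range(500)]

def predicted_click_prob(user):
    # Stand-in for a learned click model. Suppose historical data led the
    # model to score men slightly higher for a lumber-industry job ad.
    base = 0.05 if user["gender"] == "M" else 0.03
    return base + random.uniform(0, 0.01)

# The platform shows the ad only to the 100 highest-scoring users.
ranked = sorted(audience, key=predicted_click_prob, reverse=True)
shown = ranked[:100]

men_shown = sum(1 for u in shown if u["gender"] == "M")
print(f"ad shown to {men_shown} men and {100 - men_shown} women")
# → ad shown to 100 men and 0 women
```

Even a two-percentage-point gap in predicted click rates is enough here to push the entire delivered audience to one gender, which is the kind of "inadvertent" skew the researchers measured.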

A law on discriminatory algorithms

All this is happening at a time when American legislators have introduced a bill that would require large companies to audit their machine-learning and artificial intelligence systems to ensure their impartiality. The proposed Algorithmic Accountability Act would introduce rules for evaluating "highly sensitive" automated systems.

Businesses would need to evaluate whether the algorithms behind their tools create distortion or discrimination, or pose a risk to users' privacy or security. In a statement announcing the bill, one of its co-authors, Senator Ron Wyden, said that "computers are increasingly involved in making the most important decisions affecting people's lives: whether they can buy a home, get a job, or even go to jail."
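One common screen for the kind of distortion such an evaluation would look for is the "four-fifths rule" used in US employment law: if one group's selection rate is below 80% of another's, the outcome warrants scrutiny. The sketch below applies that rule to invented credit decisions; the bill does not prescribe this specific test, it is only an example of a simple audit.

```python
def selection_rate(decisions, group):
    """Fraction of a group's cases that were approved."""
    members = [d for d in decisions if d["group"] == group]
    return sum(d["approved"] for d in members) / len(members)

def passes_four_fifths(decisions, group_a, group_b):
    """True if the lower selection rate is at least 80% of the higher one."""
    ra = selection_rate(decisions, group_a)
    rb = selection_rate(decisions, group_b)
    return min(ra, rb) / max(ra, rb) >= 0.8

# Hypothetical outcomes: group A approved 50 of 100, group B 30 of 100.
decisions = ([{"group": "A", "approved": True}] * 50 +
             [{"group": "A", "approved": False}] * 50 +
             [{"group": "B", "approved": True}] * 30 +
             [{"group": "B", "approved": False}] * 70)

print(passes_four_fifths(decisions, "A", "B"))  # 0.3 / 0.5 = 0.6 < 0.8 → False
```

A real audit would go further (statistical significance, proxy variables, intersecting groups), but even this check makes the bill's core demand concrete: measure outcomes by group, not just intentions.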

In his words, these algorithms too often depend on biased assumptions or data, which can in practice reinforce discrimination against certain groups of people. The announcement of the new law also mentions the Facebook ad case, as well as a similar problem at Amazon, whose job ads in some cases were shown only to men, discriminating against women.
