AI is Not the Threat: Direction of Its Utility By People Can Be

2022-12-09

Author(s): Scott Douglas Jacobsen

Publication (Outlet/Website): Medium (Personal)

Publication Date (yyyy/mm/dd): 2019/05/02

Nature reported on a pressing and prescient warning of the dangers of a neutral tool: artificial intelligence. What is the threat of a neutral tool?

Of course, the threat comes in the form of the uses, or utility functions, provided to the AI by human beings, either as individuals or as collectives. Nonetheless, Benkler reported on the ways in which private industry, or industry in general, continues to shape the ethics and, thus, the utility functions of a powerful and sophisticated hammer: artificial intelligence.

May 10, 2019, is the due date for letters of intent to the National Science Foundation of the United States for a new funding program entitled Fairness in Artificial Intelligence. This follows from the European Commission's "Ethics Guidelines for Trustworthy AI," which was described, by an academic member of the commission, as "ethics washing" owing to the utter domination of its content by industry.

Google formed an AI ethics board in March, which fell apart within a week amid controversy. Even earlier, in January, Facebook invested 7.5 million USD in an ethics and AI centre at the Technical University of Munich, Germany.

What does this mean for the direction of the future of AI and its ethical schemata? It means the blueprints are being laid by the chickens of industry.

The input from industry, according to Benkler, remains crucial for the development of the future of AI. However, industry should not monopolize the power and the ethics. Both governments and industry should be transparent and publicly accountable in developing the ethical frameworks for AI.

Benkler stated, "Algorithmic-decision systems touch every corner of our lives: medical treatments and insurance; mortgages and transportation; policing, bail and parole; newsfeeds and political and commercial advertising. Because algorithms are trained on existing data that reflect social inequalities, they risk perpetuating systemic injustice unless people consciously design countervailing measures."

He provided an example of artificially intelligent systems capable of predicting recidivism: systems that differentially affect black and white, or European- and African-heritage, communities. In addition, or similarly, this could impact policing and the job candidacy of applicants. With the black-box inclusion of algorithms and systems in an artificial intelligence, these could simply reflect societal biases, which would be "invisible and unaccountable."

"When designed for profit-making alone, algorithms necessarily diverge from the public interest — information asymmetries, bargaining power and externalities pervade these markets," Benkler stated. "For example, Facebook and YouTube profit from people staying on their sites and by offering advertisers technology to deliver precisely targeted messages. That could turn out to be illegal or dangerous."

More in the reference…

References

Benkler, Y. (2019, May 1). Don't let industry write the rules for AI. Nature. Retrieved from https://www.nature.com/articles/d41586-019-01413-1?utm_source=twt_nnc&utm_medium=social&utm_campaign=naturenews&sf211946232=1.

License

In-Sight Publishing by Scott Douglas Jacobsen is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Based on a work at www.in-sightpublishing.com.

Copyright

© Scott Douglas Jacobsen and In-Sight Publishing 2012-Present. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Scott Douglas Jacobsen and In-Sight Publishing with appropriate and specific direction to the original content. All interviewees and authors co-copyright their material and may disseminate for their independent purposes.
