Posted on 02/24/2017 9:26:57 PM PST by BenLurkin
Since Wikipedia launched in 2001, its millions of articles have been ranged over by software robots, or simply bots, that are built to mend errors, add links to other pages, and perform other basic housekeeping tasks.
In the early days, the bots were so rare they worked in isolation. But over time, the number deployed on the encyclopedia exploded, with unexpected consequences. The more the bots came into contact with one another, the more they became locked in combat, undoing each other's edits and changing the links they had added to other pages. Some conflicts only ended when one or other bot was taken out of action.
...
While some conflicts mirrored those found in society, such as the best names to use for contested territories, others were more intriguing. Describing their research in a paper entitled "Even Good Bots Fight" in the journal PLOS ONE, the scientists reveal that among the most contested articles were pages on former president of Pakistan Pervez Musharraf, the Arabic language, Niels Bohr and Arnold Schwarzenegger.
...
Yasseri believes the work serves as an early warning to companies developing bots and more powerful artificial intelligence (AI) tools. An AI that works well in the lab might behave unpredictably in the wild. Take self-driving cars. "A very simple thing that's often overlooked is that these will be used in different cultures and environments," said Yasseri. "An automated car will behave differently on the German autobahn to how it will on the roads in Italy. The regulations are different, the laws are different, and the driving culture is very different," he said.
As the authors note in their latest study: "We know very little about the life and evolution of our digital minions."
(Excerpt) Read more at theguardian.com ...
How do these bots work? How do they know what to edit, or does a live person write a paragraph and have it inserted into the website on an automatic schedule?
Sort of imagining an MMA cage match between miffed librarians.
I’ve yet to read a Wiki page without errors, but I believe they are human, not bot. News articles, though, have recently become riddled with bot errors. I find many per day, usually syntax and grammar (which can alter meaning and clarity).
My educated guess is they scan through pages for particular errors they were programmed to correct — perhaps looking for “your” used in a grammatical context where “you’re” was meant — and then fix the particular thing they scanned for.
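That guess can be sketched in a few lines of code. This is purely illustrative — it is not how Wikipedia's actual bots are written, and the word list is an invented heuristic — but it shows the general pattern-scan-and-replace idea the comment describes:

```python
import re

# Hypothetical sketch of a single-purpose correction bot: scan text for
# "your" in positions where the contraction "you're" was clearly meant
# (signalled here by a small, made-up list of following words), and fix
# only that one thing.
PATTERN = re.compile(
    r"\byour\b(?=\s+(?:welcome|wrong|right|going|not)\b)",
    re.IGNORECASE,
)

def fix_your_youre(text: str) -> str:
    """Replace 'your' with "you're" when the next word suggests the
    contraction was intended. A crude heuristic for illustration only."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        # Preserve capitalization at the start of a sentence.
        return "You're" if word[0].isupper() else "you're"
    return PATTERN.sub(repl, text)

print(fix_your_youre("I think your wrong about that."))
# -> I think you're wrong about that.
```

A real bot of this kind would be far more conservative — false positives are exactly how two well-meaning bots end up reverting each other.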
Is it unpredictable when a bot does what it was programmed to do?
Worthless for anything outside of pop culture. And outside of prurient gossip, not of much value there.
Does this mean we’ll have to fight Skynet and Sky Net?