Notes on A.I. (Artificial Intelligence)
A.I. is a much more sophisticated algorithm. See my past post on Algorithms:
Post in thread 'Dance Your Cares Away/Fraggle/Law Abiding Citizens'
Search engines use a spider to index all the words on a web page. The spider counts how many times each word is used in context.
When someone does a search, the algorithm calculates the most relevant sites based on these word-count scores and on the searcher's past likes and searches.
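To make that concrete, here is a minimal sketch in Python of that kind of word-count indexing and relevance scoring. The site names, page text, and the interest boost are all made up for the example; real engines are far more complex.

```python
from collections import Counter

def index_page(text):
    """Spider step: count how often each word appears on a page."""
    return Counter(text.lower().split())

def relevance(query, page_index, interest_boost=1.0):
    """Score a page for a query: sum the counts of the query words,
    optionally boosted by the searcher's past likes and searches."""
    score = sum(page_index[w.lower()] for w in query.split())
    return score * interest_boost

# Two hypothetical pages indexed by the spider.
pages = {
    "site-a": index_page("fraggle rock fraggle cave fraggle"),
    "site-b": index_page("rock music rock concert"),
}

# Rank the sites for the query "fraggle" by word-count score.
ranked = sorted(pages, key=lambda p: relevance("fraggle", pages[p]), reverse=True)
print(ranked)  # site-a ranks first: it mentions "fraggle" three times
```

The point is simply that the score is mechanical counting, not comprehension.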
(Important note: As mentioned in the above linked post, algorithms can have programmed multipliers coded into them to skew results toward the programmer's wants. The same can and does occur in A.I.)
A.I. will soon replace the algorithm function for search engines. The A.I. will do much more when it indexes websites: it will comprehend the context. That is a significant difference.
The first example of this was about 10 years ago, when IBM used its A.I., named Watson, to help doctors by reading all old and new medical studies. Watson could then offer different treatment options for difficult medical cases. This was at first lauded as a medical breakthrough, but it has since declined in use. The problem is that no A.I. is actually intelligent. It is just an advanced algorithm and cannot use logic. It can only accumulate a lot of data and then regurgitate that data.
A common maxim about computers has been: "Trash in = trash out."
The newest A.I. can answer a query in a seemingly intelligent manner, often so verbose that it is difficult to tell the response is not from a human. Schools now use programs to identify A.I.-created answers from students. Some real reports written by students have been flagged as probably A.I.-created. That is a problem: how do you prove you wrote it? Students had better know what they write, because they will soon have to prove it with a verbal, in-person Q&A.
I am sure you are all familiar with bots. Bots are programs that create and use social media accounts to act like real people, liking, disliking, and even commenting on social media. These bots have been a problem and have caused well-deserved distrust of rankings. Bots are used to add false data that algorithms process, so that the search results are skewed.
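Here is a rough sketch of how bot accounts inflate an engagement signal that a ranking algorithm consumes. The account names are hypothetical, and real bot detection is much harder than a simple blocklist:

```python
def engagement_score(liking_accounts, known_bots=frozenset()):
    """Return (naive, filtered) like counts: a naive ranker counts
    every account equally; a filtered one discards known bots."""
    naive = len(liking_accounts)
    filtered = len(liking_accounts - known_bots)
    return naive, filtered

# Two real likes plus three bot likes on the same post.
likes = {"alice", "bob", "bot001", "bot002", "bot003"}
bots = {"bot001", "bot002", "bot003"}

print(engagement_score(likes, bots))  # (5, 2): bots inflate the naive count
```

An algorithm that only sees the naive count ranks the post as more popular than it really is.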
The same will happen with A.I.
There will soon be A.I. that creates content-rich websites. These sites can have hundreds of pages of data created by an A.I. programmed for a nefarious goal. A.I. can create thousands of these websites, each with hundreds of pages of data, all looking different, all serving the same overall misinformation goal, and all created in less than one minute!
Then the A.I. used by search engines will immediately read and comprehend this data and will post results skewed toward the misinformation goal. The search engine's A.I. will never have the common sense to know this data is wrong. It just knows that many sites have said the same thing, so it must be relevant.
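A tiny sketch of that failure mode, assuming a naive "relevance by repetition" rule. The site and claim names are hypothetical:

```python
from collections import Counter

def consensus_rank(claims_by_site):
    """Naive 'relevance by repetition': a claim repeated across many
    sites outranks a claim found on few, regardless of truth."""
    return Counter(claims_by_site.values()).most_common()

# One accurate site vs. a thousand A.I.-generated sites
# repeating the same falsehood.
sites = {"real-site": "claim-true"}
sites.update({f"fake-{i}": "claim-false" for i in range(1000)})

print(consensus_rank(sites)[0])  # the repeated falsehood wins
```

Nothing in the rule checks whether a claim is true, only how often it appears.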
Trash in = trash out.