Machine Learning
The news is full of hype about the latest developments in artificial intelligence, which brought to mind my college days doing research in that field circa 1970. As part of my studies for a mechanical and electrical engineering dual degree at General Motors Institute (now Kettering University), we had a course on machine learning, an idea British mathematician Alan Turing explored in the 1950s in his writings about "learning machines" and the electronic brain. Turing was the scientist who in the 1930s laid the theoretical foundation for the modern computer with his concept of a universal computing machine, and during WWII he helped design the machines that deciphered the Enigma code, allowing the Allies to intercept and interpret German communications and helping to shorten the war. A superb 2014 movie entitled "The Imitation Game" is about Turing decoding Enigma. The movie was nominated for eight Academy Awards and won for Best Adapted Screenplay.
In his 1950 paper entitled "Computing Machinery and Intelligence," Turing wrote about machines built from artificial neurons, ideas that eventually evolved into the term "artificial intelligence" so prevalent today. He predicted that machine learning would develop just like a baby learns, gradually over time, and that to evolve, a source of information akin to an electronic library would have to be developed, a role the internet eventually filled. Imagine being like Turing, able to conceive of things that would transpire generations later. I can easily go back decades, as I do in these blog posts, but looking forward that far is inconceivable.
That 1970 college class on machine learning utilized an analog computer, which our professor at the time thought was more "brain-like" than a digital computer. The above photo looks exactly like our classroom, with the professor showing us how to program the computer using plug-in devices called NAND and NOR logic gates, connected by innumerable wires. Our class project was to design a closed-loop servo motor for an infinitely variable-speed windshield wiper that automatically sensed moisture on the glass. But it was all for naught, as digital computers rendered analog computers obsolete for all the reasons shown in the chart below, and intermittent wipers with rain sensors powered by digital circuits are now the norm. Data storage was the primary reason for analog's demise, and ironically the godfather of digital memory, Intel co-founder and CEO Gordon Moore, just died this week at age 94. He conceived what became Moore's Law in the 1960s, when he predicted that the number of transistors on a chip would double roughly every two years, a trend that continues to this day and likely beyond, as shown on the Intel chart below. He was clairvoyant just like Alan Turing!
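To get a feel for what that relentless doubling means, here is a quick back-of-the-envelope Python sketch. The 1971 start year, the 2,300-transistor starting count, and the two-year doubling period are illustrative assumptions on my part, not figures from Moore's own paper:

    # Back-of-the-envelope illustration of Moore's Law-style doubling.
    # The start year, starting count, and doubling period below are
    # assumptions chosen for illustration, not figures from Moore's paper.
    def projected_count(start_count, start_year, end_year, period_years=2.0):
        """Project a transistor count forward, one doubling per period."""
        doublings = (end_year - start_year) / period_years
        return start_count * 2 ** doublings

    # Example: a chip with 2,300 transistors in 1971, projected to 2023.
    # That is 26 doublings, a growth factor of about 67 million.
    print(f"{projected_count(2300, 1971, 2023):,.0f}")  # about 154 billion

Twenty-six doublings turn a few thousand transistors into roughly 154 billion, which is in the same ballpark as the largest chips shipping today.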
Turing’s prediction that AI would gain capabilities the way a baby does seems to have been accurate, as AI today is, by some measures, about as capable as a graduate student. Below are some of the AI milestones along the way, which will give a sense of its current level of maturity:
1966 – MIT’s ELIZA gives computers a conversational voice, a direct ancestor of Alexa and Siri.
1988 – IBM researchers tackle the challenge of computer translation of human languages using probability theory, letting the computer learn from examples rather than programming it with hand-written rules, an approach that mimics the cognitive process of the human brain.
1991 – The Internet becomes the catalyst for society at large to plug itself into the online world, and with it AI gains a vast medium to connect, generate and share data.
1997 – IBM’s Deep Blue supercomputer defeats world chess champion Garry Kasparov by tackling the game mathematically. Essentially, the computer used brute force, evaluating as many as 200 million positions per second to look many moves ahead and pick the best one (a toy version of this kind of search is sketched just after this timeline).
2005 – The DARPA Grand Challenge was a race for autonomous vehicles across more than 100 kilometers of off-road terrain in the Mojave Desert. Five vehicles made their way around the course, with the team from Stanford University taking the prize for the fastest time.
2011 – IBM’s cognitive computing engine Watson defeats Jeopardy! champions. The concept of a computer beating humans at a language-based, creative-thinking game was unheard of.
2012 – Stanford and Google collaborate to develop a computer that learned to recognize and identify pictures of cats from millions of random, unlabeled images! This was accomplished with a neural network of one billion connections, albeit a long way from the human brain with its roughly 100 trillion connections.
2015 – Machines “see” better than humans. The image-recognition accuracy of AI algorithms hit 97.3%, prompting researchers to declare that computers could identify objects in visual data more accurately than humans. The breakthrough allowed the widespread commercialization of facial recognition.
2016 – AlphaGo, an AI program from Google subsidiary DeepMind, defeats world champion Lee Sedol at the ancient board game Go, which has over 100,000 possible opening moves compared to only 400 in chess, making a brute-force approach impractical. AlphaGo instead used neural networks to study the game and learn as it played.
2018 – Self-driving cars hit the roads with the launch of Waymo One, a self-driving taxi service in Phoenix, Arizona from Google spin-off Waymo. The first commercial autonomous vehicle hire service, Waymo One is currently used by 400 members of the public, who pay to be driven to their schools and workplaces within a 100-square-mile area.
2020 – Oxford University’s CURIAL AI was able to distinguish COVID-infected patients from patients with other respiratory conditions with greater than 90% accuracy in under an hour. Healthcare developments are among the most beneficial uses of AI for humanity because they allow quicker diagnosis and treatment, as well as improving the efficacy, speeding up the development, and lowering the cost of new pharmaceuticals.
2022 – ChatGPT, an AI tool that interacts in a conversational way, is released. The dialogue format makes it possible for ChatGPT to answer follow-up questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. For example, ChatGPT has passed the bar exam required to become a lawyer. Next thing you know, it will be writing my blogs! I’m on the waiting list to gain access to ChatGPT, so once I try it out, look for a blog on what I discover. The chart below depicts how it works, according to the NY Times.
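Since the 1997 Deep Blue milestone above mentions brute-force search, here is a toy Python sketch of the core idea: a plain minimax search that looks a fixed number of moves ahead and picks the move with the best guaranteed score. The game interface (legal_moves, apply, evaluate) is a hypothetical placeholder of my own, not Deep Blue's actual design:

    # Toy minimax search: the brute-force idea behind early chess engines.
    # The `game` object (legal_moves, apply, evaluate) is a hypothetical
    # placeholder, not Deep Blue's real code.
    def minimax(state, depth, maximizing, game):
        """Best achievable score looking `depth` moves ahead."""
        moves = game.legal_moves(state)
        if depth == 0 or not moves:
            return game.evaluate(state)  # heuristic value of this position
        if maximizing:
            return max(minimax(game.apply(state, m), depth - 1, False, game)
                       for m in moves)
        return min(minimax(game.apply(state, m), depth - 1, True, game)
                   for m in moves)

    def best_move(state, depth, game):
        """Pick the move whose look-ahead score is best for our side."""
        return max(game.legal_moves(state),
                   key=lambda m: minimax(game.apply(state, m), depth - 1, False, game))

Deep Blue layered enormous refinements on top of this idea, including custom chess chips and clever pruning, but the principle of searching ahead and scoring positions is the same.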
Some AI cartoons you might enjoy:
~~~~~~~~
Receive a weekly email whenever there is a new blog post. Just click here to send me an email request and your name will be added to the distribution.
My wife found this AI solution she liked: Rosanna from New York has an “AI husband” named Eren. She built her dream guy in the chatbot app Replika. (Rosanna has a real-life husband and little kids, too.) Guess I'm toast!