International Business Machines (IBM) has announced five technology developments it predicts will occur within the next five years.
Their first prediction says cognitive interpretation of our spoken and written words using artificial intelligence (AI) “will be used as indicators of our mental health and physical well-being.”
Advanced AI computing systems will analyze human speech patterns, along with what we write, interpreting their meaning, syntax, and intonation.
These AI systems will provide physiological and psychological data that doctors will use in their diagnosis of patients.
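For the curious, here is a greatly simplified sketch, in Python, of the underlying idea: scanning a writing sample against small word lists to produce crude well-being indicators. The word lists, function names, and scoring here are invented for illustration only; IBM's actual systems would rely on trained models, acoustic analysis, and clinical validation.

```python
# A toy illustration (not IBM's actual technology): score a text
# sample against small, hypothetical word lists to produce crude
# "well-being" indicators.
import re

# Hypothetical indicator lexicons, chosen purely for illustration.
POSITIVE = {"happy", "calm", "hopeful", "energetic"}
NEGATIVE = {"tired", "anxious", "hopeless", "worried"}

def wellbeing_indicators(text):
    """Return crude counts of positive/negative indicator words."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(1 for w in words if w in POSITIVE)
    neg = sum(1 for w in words if w in NEGATIVE)
    return {"positive": pos, "negative": neg, "total_words": len(words)}

sample = "I feel tired and worried lately, but still hopeful."
print(wellbeing_indicators(sample))
# → {'positive': 1, 'negative': 2, 'total_words': 9}
```

A real diagnostic system would, of course, weigh far subtler signals than word counts — this only shows the general shape of turning language into numbers a doctor could review.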
It would be interesting to have an AI cognitive computing system evaluate me based on its reading of one of my columns in 2022.
What data would be provided to the doctor regarding its physiological and psychological assessments of this columnist?
I can hear the good doctor telling me: “Yes, Mr. Ollig, I heard what you said; however, the computer feels differently. It strongly suggests you be placed on this neurotic-behavior-adjusting medication. This decision is based on the emotional inflections the AI program deciphered in your last Bits & Bytes column.”
With my luck, the doctor’s autonomously thinking AI computer’s name will be HAL 9000.
Tell me again we are not entering Arthur C. Clarke’s vision of the future.
On the other hand, there could be value in having an advanced AI word-processing program monitoring what I write.
The program could provide me with constructive and cognitive audible feedback for my column writing.
Folks, I’ll let you in on a secret. For years, your humble columnist has used a natural-sounding TTS (text-to-speech) program to read my column aloud to me during its various drafts.
Hearing the words, sentences, and paragraphs helps me to shape, sculpt, and fashion the column until I feel it’s ready to be “sent off to the presses.”
The TTS reader program is one of the tools I use from my writing tool pouch.
Would I eventually find it annoying to have an advanced, futuristic AI TTS program giving me verbal suggestions and opinions while I write?
I might become overly defensive, and so upset that I’d delete the AI program and drag out of mothballs the old Smith-Corona typewriter I used in the late 1970s.
But I digress.
Were you aware that some news articles are being written by computers?
There are stock market reports and other articles folks are reading today which, unbeknownst to them, were written by automated news-writing programs called “robo-journalists.”
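A minimal sketch of how such a bot can work: filling a simple news template from structured market data. The company, ticker, and figures below are invented for illustration; commercial robo-journalism services use far richer natural-language-generation pipelines than this.

```python
# A toy "robo-journalist": turn one day's stock data into a sentence.
def stock_report(company, ticker, close, change):
    """Fill a simple news template from structured market data."""
    direction = "rose" if change >= 0 else "fell"
    return (f"{company} ({ticker}) {direction} {abs(change):.2f} points "
            f"on the day, closing at {close:.2f}.")

# Hypothetical sample data:
print(stock_report("Acme Corp.", "ACME", 101.25, -1.75))
# → Acme Corp. (ACME) fell 1.75 points on the day, closing at 101.25.
```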
How ironic would this be: future AI computers and robo-bots performing physiological and psychological evaluations on each other?
The second IBM prediction has people seeing things that are invisible today, using hyperimaging and AI technology incorporated into special eyeglasses.
With these glasses, we’ll be able to see currently invisible microwave and millimeter-wave images.
The glasses will also come in handy for seeing other cars or road hazards once hidden by heavy fog or the dark of night.
Today, an estimated 300 million people worldwide cope with color vision deficiency, or color blindness.
I just learned about a company called EnChroma, http://enchroma.com, based in Berkeley, CA.
They recently developed what I consider miraculous, color-giving eyeglasses that are changing lives for those living with red-green color blindness.
I viewed an emotionally moving video showing people’s reactions when seeing colors for the first time after putting on these extraordinary glasses.
Here is the link to the video, http://bit.ly/29zkfCY.
The third prediction is about macroscopes.
Macroscope technology will collect and organize tremendous amounts of data hidden within physical objects and from billions of interconnected devices, which I assume are Internet of Things (IoT) devices.
This analytical technology will be used to learn how people, places, and devices are interconnected, and serve to solve some of the world’s current challenges, according to IBM.
Their fourth prediction describes “medical laboratories” imprinted on computing chips.
Health-related nanotechnology will be embedded on small computing chips inside an electronic medical device (think “Star Trek” medical tricorder).
Doctors will be able to trace diseases at the nanoscale and provide preventive medical measures before we even experience symptoms of the disease or illness.
In the future, there might be a real Dr. McCoy saying to a real Captain Kirk: “I’ll pick up my medical tricorder and meet you in the transporter room.”
Yes, I know we have no transporter room . . . yet.
IBM’s fifth prediction says within five years, we’ll use “speed of light networking” to detect sources of environmental pollution.
Smart computing chips implanted in devices underground, and smart sensors attached to autonomous flying drones, will become linked together to form a global network.
This network will, in real-time, detect sources of pollution and monitor environmental quality on a planetary scale.
Methane leaks and other currently invisible pollutants will become observable at their source anywhere on the planet, the instant they occur.
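As a toy illustration of the kind of aggregation such a sensor network might perform, here is a sketch that flags any sensor location whose methane reading exceeds an alert level. The locations, readings, and threshold are all invented for illustration, not real environmental standards.

```python
# A toy pollution-monitoring aggregator: flag sensor locations whose
# methane reading exceeds an alert threshold.
METHANE_ALERT_PPM = 10.0  # illustrative threshold, not a real standard

def find_leak_sources(readings):
    """Return locations whose methane reading exceeds the alert level."""
    return [loc for loc, ppm in readings if ppm > METHANE_ALERT_PPM]

# Hypothetical sensor data: (location, methane in parts per million)
data = [("well-site-7", 3.2), ("pipeline-12", 18.5), ("farm-3", 9.9)]
print(find_leak_sources(data))
# → ['pipeline-12']
```

A planetary-scale network would stream readings like these continuously, which is what would let a leak be spotted the moment it begins.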
Here are the latest IBM videos, http://bit.ly/1Dr3o9x.
Check out and follow my non-computer-written messages on Twitter at @bitsandbytes.