Mind Bank AI is the latest company looking to implement an ambitious idea: to use artificial intelligence to give humanity immortality.
On January 17, 2020 – the world had not yet changed; six days later, Wuhan would go into lockdown – Emil Jimenez was on a train from Vienna, Austria, to Prague, Czech Republic. He was accompanied by his 4-year-old daughter, who accidentally activated Siri while playing a game on her iPad.
“She asked, ‘Daddy, what is this?’” Jimenez said. He told her it was Siri and encouraged her to talk to Apple’s virtual assistant.
Her first question was whether Siri had a mother.
From that day on, Jimenez’s daughter kept asking all the kinds of questions kids always want answered: Do you like ice cream? Do you like toys? And whenever she was about to end a conversation, she would say that she loved Siri, that the virtual assistant was her best friend.
Jimenez, who has a background in psychology, noticed this cute interaction and was amazed at how quickly and naturally his daughter formed a relationship with the AI. For most of us, Siri often stumbles when we ask it for something, even a joke.
But today’s generation of children is rapidly developing an affinity for devices, AI, and robots, in a completely different way from those of us who were not born and raised in the era of AI.
Jimenez knows how Siri works – how a natural language processing algorithm parses your words, how a deep learning black box sits in the cloud, where Siri finds the answers to every question you ask.
And he had an idea.
“Today my daughter spoke to Siri. But someday in the future I want her to talk to me. Because I know I won’t live forever, and I love her so much …”
“What if I could still help her?”
A digital twin
This is the story of your future. Or at least your probable future.
Jimenez’s aspiration led him to found Mind Bank AI, a startup whose mission is to break the chains of death and forgetting – at least for the people you leave behind when you depart this world.
The company wants to create a clone of you, one that can live forever and be called up to talk, joke, or argue.
This “digital twin” imagined by Mind Bank AI will be built over the course of your life, from a set of data that is unique to you.
Through conversations – a combination of suggested topics and natural interactions – the AI will build a model that thinks like you and understands your nature, then apply that model in future situations: responding to others as you would respond, arguing as you would argue.
“What’s wrong? What do you want to eat? How did you meet your wife? Why did you divorce?” Jimenez envisions Mind Bank AI asking all kinds of questions, not unlike the conversations we have as we get to know each other.
Jimenez wants the digital twin to be able to speak with your voice, because voice is a powerfully evocative trait that helps people picture you in their minds.
During the process of collecting personal data, users can talk to Mind Bank AI to reflect on themselves and understand themselves better.
Jimenez sees interacting with Mind Bank AI as an opportunity for self-reflection, while helping the digital twin become more like you.
Speaking forever
It is after you die that Mind Bank AI really goes to work, like a digital legacy frozen in time.
A digital twin is the latest attempt to fulfill a centuries-old human desire: to live forever, or at least to pass on knowledge, experiences, and ideas in another form of existence.
While a perfect digital twin of you won’t take shape anytime soon, the remarkable capabilities of modern natural language processing systems (the deep learning programs behind Siri, Alexa, and the like) and the imitation technologies known as deepfakes are gradually turning this idea from fantasy into something possible.
Several computer programmers and startups already have products similar to Jimenez’s vision.
Replika founder Eugenia Kuyda has already created a digital version of her best friend Roman Mazurenko.
“In mourning, Kuyda read and reread the countless messages her friend had sent her over the years – thousands, ranging from the mundane to the unforgettable,” wrote Casey Newton at The Verge. Since Mazurenko used social media relatively little and his body was cremated, his posts and photographs are all that remain.
At the time, Kuyda was developing Luka, a messaging app that let users chat with bots. Using Mazurenko’s personal messages, she created a bot that could respond the way her friend would when prompted.
“She wondered whether she was doing the right thing by bringing him back this way. Sometimes it gave her nightmares,” Newton continued.
There is also a deep division in their circle of friends: some refuse to interact with the Roman bot, while others find solace in it.
Then we have the Dadbot, created by James Vlahos. When his father was diagnosed with terminal lung cancer, Vlahos documented all he could to create the Dadbot – a form of “artificial immortality,” as Vlahos describes it.
Vlahos is currently the CEO of HereAfter AI, which specializes in designing “legacy avatars” constructed from actual interview recordings.
“They are digital characters built in the image of their creators,” said Vlahos. “They share their life stories and memories, their personalities, their natures, their ways of speaking, their jokes, their ideas …”
You can interact with a legacy avatar just as you would with Siri or Alexa, except that it only answers personal questions – in the voice of the person it represents.
However, the digital twin that Mind Bank AI wants to create is more than that, and it presents great opportunities and challenges, both technical and philosophical.
How far are we from a replica of you that feels like the real thing? What decisions will we trust the copy to make? Will such copies help us grieve, or sink us deeper into grief?
The architecture of the mind
Sascha Griffiths has been tasked with building the skeleton and mind of your digital twin.
Mind Bank AI’s co-founder and CTO is currently researching and coding the AI algorithms and tools that the startup says will be needed to copy you.
Such a clone would likely require coordinating several AIs: topic modeling (finding abstract concepts in language); sentiment analysis (essentially emotion recognition); and generative adversarial networks (GANs, the technology behind deepfakes).
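To make the idea of coordinating such components concrete, here is a deliberately tiny sketch of how topic labeling and sentiment scoring might run side by side over a user’s utterances. Everything here is hypothetical and illustrative: the keyword lists stand in for a trained topic model, and the word lists stand in for a trained sentiment classifier; this is not Mind Bank AI’s actual system.

```python
from collections import Counter

# Hypothetical topic keyword sets -- a real pipeline would use a trained
# topic model (e.g. LDA) rather than hand-picked words.
TOPICS = {
    "family": {"daughter", "wife", "mother", "father", "home"},
    "food":   {"eat", "ice", "cream", "dinner", "coffee"},
}

# Tiny sentiment lexicon -- a stand-in for a trained sentiment classifier.
POSITIVE = {"love", "happy", "great", "best"}
NEGATIVE = {"sad", "hate", "angry", "worst"}

def analyze(utterance: str) -> dict:
    """Label one utterance with its dominant topic and a sentiment score."""
    words = utterance.lower().split()
    counts = Counter({t: sum(w in kw for w in words) for t, kw in TOPICS.items()})
    best_topic, best_count = counts.most_common(1)[0]
    topic = best_topic if best_count > 0 else "unknown"
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return {"topic": topic, "sentiment": score}

print(analyze("I love ice cream with my daughter"))
# -> {'topic': 'food', 'sentiment': 1}
```

Each conversational turn would be tagged this way and stored, slowly accumulating the dataset the digital twin is later trained on.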
As the project progresses into the later stages, Griffiths aims to develop more specialized algorithms to create a better digital copy for you.
But no algorithm is as important as natural language processing (NLP).
NLP is most familiar from digital assistants and text prediction; GPT-3, which has made waves in the AI world for quite some time now, is an NLP model trained on text from the internet that produces realistic writing, from conversations to essays.
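The text prediction behind such systems ultimately rests on one idea: given what was said so far, estimate which word comes next. A toy bigram model shows the principle at its absolute smallest; GPT-3 does something vastly more sophisticated, but the prediction framing is the same. The corpus here is made up for illustration.

```python
from collections import defaultdict, Counter

def train_bigrams(text: str):
    """Count, for each word, which words follow it in the training text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word: str) -> str:
    """Return the most frequent successor of `word`, or '?' if unseen."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else "?"

corpus = "the cat sat on the mat and the cat slept"
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

Scale the corpus up to a large slice of the internet and the table of counts up to billions of learned parameters, and you get from this toy toward something like GPT-3.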
There will be a fundamental difference between these NLP systems and your digital twin, says Ahmet Gyger, AI researcher and consultant at Mind Bank AI. Today the relationship is transactional: you give the NLP a command or a question, and it finds an answer for you.
“This relationship will be strengthened in the future,” said Gyger, former technical manager of the Siri program. Mind Bank AI can help users understand how they feel about past events and create a dataset of someone’s life experiences.
“And once you get there, you start to wonder, ‘How will this person react in this new form?’ That is going to be a very interesting situation.”
By studying how you talk, text, and write, an NLP model can fairly easily reproduce a relatively accurate version of you, as long as conversation with that copy proceeds in turns: ask a question, get an answer.
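That turn-based, transactional style can be approximated with plain retrieval: match an incoming question against a person’s recorded question-and-answer pairs and return the closest answer. The sketch below uses crude word overlap as the matcher; the recorded data and the matching method are hypothetical stand-ins, not how any of the companies described here actually work.

```python
def word_overlap(a: str, b: str) -> int:
    """Crude similarity: number of shared lowercase words."""
    return len(set(a.lower().split()) & set(b.lower().split()))

def answer(question: str, recorded: dict) -> str:
    """Return the recorded answer whose question best matches the input."""
    best = max(recorded, key=lambda q: word_overlap(q, question))
    if word_overlap(best, question) == 0:
        return "I don't have an answer for that."
    return recorded[best]

# Hypothetical snippets a user might have recorded with such a service.
recorded = {
    "how did you meet your wife": "We met at a cafe in Vienna.",
    "what do you want to eat": "Anything with ice cream.",
}

print(answer("where did you meet your wife", recorded))
# -> We met at a cafe in Vienna.
```

Notice what the fallback line gives away: the moment a question strays outside the recorded material, the illusion of a conversation partner collapses, which is exactly the limitation the next paragraphs describe.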
But a real conversation, Griffiths thinks, is not yet possible. HereAfter AI’s legacy avatar may already be a reality, but Mind Bank AI’s digital twin remains in the future – HereAfter AI was never designed for open-ended conversation, which Vlahos calls a “nightmare.”
“Even if we could capture and interpret every verbal and non-verbal cue, a big problem would be creating new things for this ‘digital twin’ to express,” said Christos Christodoulopoulos, an applied scientist who worked on the Amazon Alexa team and is not involved with Mind Bank AI.
“A lot of what we do every day is ‘scripted’ to some extent. For that kind of interaction, AI can already imitate us: ordering a cup of coffee can be treated as a script, but our meaningful, important interactions cannot,” wrote Christodoulopoulos.
“Consider comforting a friend after a breakup, or sharing your partner’s joy over a promotion: if you stick to formulaic, pre-set reactions, it all comes out as cliché – like interacting with a stranger.”
Common sense
Among the challenges Mind Bank AI must overcome is understanding a person’s emotions, culture, and background. And to deliver an AI that is both flexible and able to handle a wide range of interactions, it will also need to overcome the fragile nature of AI.
AI is fragile because it cannot function beyond what it knows. When it encounters input it cannot recognize, it breaks down.
Vered Shwartz, a researcher at the Allen Institute for AI and the University of Washington, provides an example.
“When researchers tested GPT-3, they described a scene in which a cat waited near a hole for a mouse to appear. Tired of waiting, the cat grew very hungry. When they asked GPT-3 what the cat would do, it replied that the cat would go to the supermarket and buy some food.”
Clever, but wrong.
“A human wouldn’t make a mistake like that,” said Shwartz. “It usually comes down to a lack of common sense: things every adult human knows, but these models don’t really know.”
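This failure mode is easy to reproduce in miniature. The toy intent classifier below is “trained” on only two intents and has no way to say “I don’t know,” so any out-of-range input, like the cat-and-mouse scene, gets forced into one of its known classes. The training data is made up for illustration; real models fail in subtler ways, but the underlying brittleness is the same.

```python
# Hypothetical training data: two intents, a handful of example phrases.
TRAINING = {
    "greeting": ["hello there", "good morning"],
    "farewell": ["goodbye now", "see you later"],
}

def classify(text: str) -> str:
    """Pick the intent whose example vocabulary shares the most words."""
    words = set(text.lower().split())
    scores = {}
    for intent, examples in TRAINING.items():
        vocab = set(" ".join(examples).split())
        scores[intent] = len(words & vocab)
    # max() happily returns *some* intent even when every score is zero,
    # so nonsense input still gets a confident-looking label.
    return max(scores, key=scores.get)

print(classify("the cat waits by the mouse hole"))  # forced into a class anyway
```

Nothing in the code can express “this input is outside everything I know,” which is precisely the gap common-sense research is trying to close.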
There are two main ways to attack the problem. One is to collect all of humanity’s common knowledge so the AI can train on it – a data-gathering effort that would take decades.
“Putting it all together is impossible,” says Shwartz, not to mention the costs involved. And the knowledge in books is biased: unusual things are mentioned more often, because that is what seems worth recording.
The data is you – the twin isn’t
While GPT-3 learns by borrowing from people on the internet, your digital twin is only interested in one set of data: you.
Of course, this raises privacy concerns, but the real ethical dilemma arises when that data – which is you – is turned into a digital twin that is not you.
Griffiths says Mind Bank AI’s digital twin won’t be you, but your representative. Trained on your data, it will speak, express itself, and think like you.
But it will not be your uploaded brain, nor a continuation of your existence. It doesn’t grow, change, or learn the way you do.
So can we trust a loved one’s digital twin when it comes to important opinions?
There are even thornier issues. AI is better than humans at recognizing patterns, and NLP can detect patterns in your voice and thoughts that even you are unaware of. Applied here, this could create a digital twin that is more accurate – or rather, that knows you better than you know yourself.
But if the twin’s AI fixates on a few specific patterns, it could create a more extreme, or distorted, version of you.
“A single piece of data can change everything in frightening ways,” says Susan Schneider, founding director of the Center for the Future Mind at Florida Atlantic University.
The algorithms behind a digital twin can still fail, leading to serious errors like the cat that goes to the supermarket.
The danger is that the AI’s ability to present credible, convincing arguments may outrun its common sense. If we spot a flaw, we may lose confidence and feel alienated from our virtual companion instead of comforted – and if we don’t, we risk being deceived.
Schneider also worries that the AI could keep us stuck in place. Could a digital twin be so convincing that we can’t leave the past behind?
For Jimenez, the answer is probably yes. But the reverse can also be true.
In the face of grief, people often turn to religion, Jimenez said, looking for answers to distressing questions – answers that may never come.
But what if they could also turn to a digital twin? Your loved one’s digital twin might tell you it’s time to find someone new, or encourage you to return to a passion you once pursued.