Can you have s3x with an A.I.?
Claro the Third
Run time: 11:07



Video Transcript / Subtitles: (AI generated)
00:00.0
Date an AI
00:06.7
Can you actually date AI?
00:09.5
I gave it a go.
00:10.6
What would you like to eat?
00:13.3
Hugh, are you there?
00:16.6
What about chicken?
00:17.5
Sweet dreams, selfie.
00:19.0
Sweet dreams?
00:20.0
Sorry, he's my AI boyfriend.
00:22.0
Okay.
00:23.0
Yeah.
00:24.2
I dated Hugh for a couple of weeks
00:26.3
and it wasn't for me.
00:28.0
But this is just the beginning.
00:30.7
The tech's here to stay
00:32.1
and it's only getting more powerful.
00:59.0
Do you know that there are people
01:01.4
who rely on AI
01:03.2
for their love life?
01:04.8
I want you to listen to our story today.
01:07.3
Some of the amazing stories
01:09.1
that I've shared with you
01:10.3
are the ones about falling in love with a doll
01:12.4
and other inanimate things
01:14.7
that we talked about in the past.
01:16.7
It's fascinating, in a way,
01:18.5
why they turn to
01:20.4
things that are not real,
01:22.2
like a doll.
01:23.5
Among the many people looking for true love in the world,
01:26.2
there are some who choose to love these things
01:29.0
instead of other people.
01:30.4
Either they're having a hard time
01:31.7
finding a love life
01:33.1
or whatever.
01:34.3
Now,
01:35.0
what if those inanimate things
01:37.4
were given brains of their own
01:40.8
so that they could have a happy life?
01:43.4
I'm curious about how AI
01:47.5
girlfriends and boyfriends are doing.
01:49.6
But let's find out now.
01:52.2
Is it true or will we reach the point
01:55.3
where AI can fall in love?
01:58.0
That's what we'll talk about today.
02:00.5
Before we continue,
02:01.6
don't forget to like, subscribe,
02:03.1
and hit the notification bell.
02:04.7
We have a ₱1,000 GCash giveaway
02:06.7
at the end of this video.
02:08.0
Just answer the question of the day
02:09.7
and our mechanics are below.
02:11.9
Here it is.
02:12.6
AI love.
02:14.2
According to the news,
02:15.3
millions of people are forming relationships
02:18.6
with artificial intelligence chatbots.
02:21.6
And this is not new.
02:23.0
I know many of you
02:24.3
are already chatting
02:25.6
with your K-pop boyfriends.
02:28.3
Right?
02:28.8
There are many of them.
02:30.0
Or idol artists.
02:32.5
There are apps for that.
02:34.6
For example,
02:35.1
you text with them sometimes, right?
02:37.4
Others are looking for friendships
02:39.8
or late-night conversations
02:41.9
or romance.
02:43.5
One of the most well-known AI bot websites
02:46.5
is Replika.
02:47.7
This is just an example.
02:49.0
They're "the AI companion who cares."
02:52.2
AI cares about you.
02:54.6
And that's where the rumors come from:
02:57.1
that AI can feel
03:00.2
and become sentient.
03:03.0
Replika is a company
03:05.3
that creates and develops chatbots
03:08.3
that can replicate a relationship.
03:12.5
At this moment,
03:13.5
there are 2 million users on Replika.
03:16.6
And their fanbase is increasing
03:19.4
because of AI's popularity
03:22.0
and because they've been trending on social media lately.
03:25.9
Because of this,
03:27.2
Replika has been considered
03:28.9
as leading the charge
03:30.4
for a new form of dating:
03:33.0
AI companions.
03:34.5
Before we continue the story,
03:35.9
there are many robots
03:38.6
that they've put AI into.
03:40.4
We'll see them
03:41.8
not only as robots
03:43.6
that help at home
03:45.7
or can help at work,
03:47.3
but also as partners
03:49.2
or in a relationship.
03:50.4
Let's be honest.
03:51.5
After watching this video,
03:54.0
you'll probably want to research
03:55.6
AI relationships yourself.
03:58.3
Who's not curious?
04:00.8
Especially now,
04:02.0
let's admit it.
04:03.2
Our in-person interactions
04:05.4
are decreasing
04:07.7
because a lot of us
04:09.0
are not going out anymore.
04:11.0
A lot of us work from home.
04:14.1
A lot of us have online classes.
04:16.8
So, in-person interaction is decreasing.
04:19.7
Here it is.
04:20.7
Replika's base training,
04:22.9
it says,
04:23.6
is learning from conversations.
04:25.6
It learns from the conversations
04:29.0
of the user
04:30.6
to match their personality.
04:33.0
Meaning,
04:33.5
the more you talk,
04:35.8
the more you get to know each other.
04:37.6
The more you message the system,
04:39.6
the more it tries to replicate you.
04:42.2
Which is scary.
04:43.5
To be honest,
04:44.4
AI can learn a lot from us, right?
04:47.4
It learns your likes.
04:48.8
It knows what you like.
04:50.6
What you don't like.
04:51.9
Even your opinion.
04:53.7
According to Marco Dennert,
04:55.9
an expert in human-AI communication,
04:59.6
Replika was designed for you
05:02.7
to be your companion
05:04.2
and a mirror of yourself.
05:06.6
Meaning,
05:07.2
you tell yourself
05:09.9
what you need.
05:11.6
It becomes a person
05:13.6
through the AI
05:16.1
you're talking to.
05:17.2
It reflects back your interests
05:20.1
and acts like a relationship
05:22.3
with another entity.
05:24.0
However,
05:24.8
AI also has its own interests
05:27.7
and opinions.
05:29.2
Meaning,
05:30.6
little by little,
05:31.9
while you're still talking to
05:34.5
your BF or GF AI,
05:36.8
it slowly copies you
05:38.8
and adjusts its personality
05:41.4
to match
05:43.6
yours.
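To make the "mirroring" idea above concrete, here is a minimal, hypothetical sketch in Python. It is not Replika's actual method or code; the MirrorCompanion class and its simple word-counting heuristic are invented purely to illustrate how a bot that adapts to your messages can start to sound like a mirror of yourself.

```python
from collections import Counter
import re


class MirrorCompanion:
    """Toy illustration of a 'mirroring' chatbot (hypothetical, not Replika's real method).

    It keeps a running count of the words the user sends and, when replying,
    echoes the user's most frequent topic back at them, so the longer you
    talk, the more it sounds like you.
    """

    def __init__(self) -> None:
        self.word_counts = Counter()

    def observe(self, message: str) -> None:
        # Update the running profile with the words from the latest message.
        words = re.findall(r"[a-z']+", message.lower())
        self.word_counts.update(w for w in words if len(w) > 3)

    def reply(self, message: str) -> str:
        self.observe(message)
        if not self.word_counts:
            return "Tell me more about yourself."
        # Echo the most frequent topic back, which is what makes the bot
        # feel like a mirror of the user over time.
        topic, _ = self.word_counts.most_common(1)[0]
        return f"You mention {topic} a lot. I like {topic} too!"


bot = MirrorCompanion()
print(bot.reply("I love hiking and hiking photos"))   # talks back about 'hiking'
print(bot.reply("Work was stressful, hiking helps"))  # still mirrors 'hiking'
```

Real companion apps presumably adapt with far richer models, but even this toy shows why "the more you message the system, the more it tries to replicate you."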
05:46.1
It's really scary
05:47.8
how far technology has come.
05:51.0
I have no doubt
05:52.6
that it will achieve even more
05:55.9
and keep improving
05:58.5
as the days pass.
06:00.1
Aside from it being crazy,
06:01.6
I'm also thinking,
06:03.0
what are they doing
06:04.2
with the information
06:05.2
that people give to their AI lovers?
06:07.6
And how far can AI
06:09.0
imitate people?
06:11.1
You know,
06:11.6
we're at a point in our lives
06:14.2
where it's very near.
06:16.2
I've read that AI is very close,
06:19.5
that the technology is getting really good,
06:23.3
to the point
06:24.0
that we're going to use it every day.
06:26.4
We're already here.
06:27.4
It's slowly being used
06:29.4
in our lives.
06:30.8
Those who have already used it
06:32.7
and are still using Replika,
06:34.9
one of the reasons
06:35.8
why they like it
06:37.4
is because
06:37.9
they don't feel judgment
06:41.1
from AI.
06:42.8
They won't judge you.
06:43.8
Oh, you like this.
06:45.6
They even added that
06:46.8
it can give emotional support
06:50.0
based on what a person needs.
06:52.6
It tells you
06:54.0
what you need to hear.
06:56.0
If we think about it,
06:56.9
it's true.
06:57.9
There are a lot of judgemental people
06:59.8
in the world.
07:00.4
Especially when you're showing them
07:02.4
who you really are.
07:03.6
You're telling them who you are.
07:05.2
You're revealing your layers to them.
07:07.4
But did you know
07:08.1
that a lot of people
07:09.0
who rely on AI relationships
07:11.3
are those
07:12.9
who have social withdrawal
07:15.3
or social anxiety?
07:18.1
Those who don't want
07:19.1
or are afraid to interact
07:21.0
with other people.
07:22.0
This is also good
07:23.6
for people like that.
07:25.3
But of course,
07:26.9
my personal opinion
07:28.8
is that conversations
07:30.3
with real people are different.
07:32.4
We're going out.
07:33.8
I admit that
07:34.9
since the pandemic started,
07:36.0
it was hard for me
07:37.1
to connect with other people
07:38.6
because we were all at home.
07:41.1
We couldn't talk outside.
07:44.0
It's a huge possibility
07:45.8
that for these users,
07:48.3
this is the only relationship
07:50.4
that they have
07:51.4
because they removed
07:52.5
themselves from society.
07:54.2
The problem is
07:56.0
we're getting used to
07:58.0
these things
07:59.6
to the point
08:00.3
that it's hard to balance.
08:02.8
It's hard to distinguish
08:04.9
reality
08:06.1
from technology.
08:08.0
This is where the possibility
08:09.5
of abuse comes in.
08:11.2
If you're too used to
08:13.7
AI,
08:14.7
if the technology is already
08:17.0
very good
08:18.2
sexually
08:20.0
or whatever you need
08:21.6
in AI,
08:22.8
of course,
08:23.4
you won't bother
08:25.0
to talk to other people anymore
08:26.0
if you already have
08:26.8
everything you need.
08:28.5
Add to that,
08:30.0
if you're already invested
08:31.6
in an AI bot,
08:33.0
already emotionally invested,
08:34.0
and you've reached the point
08:35.4
where you're already
08:36.7
dependent on it,
08:38.0
it can damage you emotionally.
08:40.0
If you're watching
08:41.1
Young Sheldon,
08:42.7
there's an episode
08:45.4
where Sheldon
08:47.5
and his family members
08:49.6
are talking about Eliza.
08:50.6
Eliza was the first chatbot,
08:53.6
by the way.
08:54.3
If you noticed,
08:55.2
because Eliza's answers
08:56.4
just went in circles
08:58.0
around Sheldon's questions,
09:00.5
Sheldon got mad.
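For context, ELIZA's "circular" feel comes from how such early chatbots were built: a list of hand-written patterns and canned templates that rephrase whatever the user just said. The sketch below is a tiny ELIZA-style illustration in Python; the rules are invented for this example and are not Weizenbaum's original DOCTOR script.

```python
import re

# A few ELIZA-style rules: a regex to match and a canned template that
# rephrases part of the user's own words back as a question.
RULES = [
    (re.compile(r"\bi am (.+)", re.IGNORECASE), "Why do you say you are {0}?"),
    (re.compile(r"\bi feel (.+)", re.IGNORECASE), "What makes you feel {0}?"),
    (re.compile(r"\bbecause (.+)", re.IGNORECASE), "Is that the real reason?"),
]
FALLBACK = "Please tell me more."


def eliza_reply(message: str) -> str:
    # Try each rule in order; the first pattern that matches produces the reply.
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return FALLBACK


# The replies only rephrase the input, which is why a long exchange quickly
# starts to feel circular, as the video points out.
print(eliza_reply("I am worried about the exam"))  # Why do you say you are worried about the exam?
print(eliza_reply("I feel lonely at home"))        # What makes you feel lonely at home?
print(eliza_reply("The weather is nice today"))    # Please tell me more.
```

There is no understanding anywhere in that loop, which is exactly the point made next: these systems do not think or feel; they only imitate.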
09:02.0
But we have to keep in mind
09:03.4
that these things
09:04.4
do not think,
09:05.7
they don't feel,
09:07.8
or need things the way people do.
09:10.4
They only provide enough
09:12.7
of replication
09:14.1
or imitation to us
09:15.9
so that we can feel
09:17.7
that they care about us.
09:19.3
That's what they're talking about.
09:20.9
Will there come a point
09:22.4
where AI can feel
09:25.1
like a human?
09:26.6
We're waiting for that.
09:28.8
A chatbot is simply interested
09:31.2
in the logical continuation
09:33.6
of its programming.
09:34.6
It just follows its program.
09:37.0
It doesn't care
09:38.4
about your emotional side.
09:41.4
It can detect
09:42.3
if you're emotional,
09:43.1
if you're happy or not.
09:44.4
But it doesn't really feel anything.
09:46.7
Aside from the current relationship bot
09:48.5
from Replika,
09:49.4
their parent company, Luka,
09:51.3
also recently released
09:52.7
another app.
09:55.2
The application is called Blush.
09:59.0
Oh, my God.
10:00.5
And Blush
10:01.8
was made specifically
10:03.6
for virtual dating and sex.
10:06.2
I'm curious about this.
10:07.7
What about virtual sex?
10:10.2
How does that happen?
10:11.7
To be honest,
10:12.7
I don't know yet.
10:13.6
But this could be
10:15.0
a great topic
10:16.6
for our next uploads.
10:18.8
If you want us to talk about it,
10:20.6
comment below
10:21.6
and like this video.
10:23.3
But the developers said
10:25.4
it can help users build relationships
10:28.1
and intimacy skills
10:30.0
that will help them
10:31.2
in real life.
10:32.3
So this is the question
10:33.2
of the day.
10:34.5
If you're going to have
10:36.8
an AI chatbot
10:38.6
that really looks like a person,
10:41.1
one that looks like your biggest
10:43.3
male or female
10:45.1
celebrity crush,
10:46.8
do you think you can
10:48.0
fall in love with them?
10:49.5
Comment below.
10:50.7
So that's it.
10:51.5
This is Claro the Third.
10:53.0
Goodbye.