Video by Johnny Harris: https://youtu.be/S951cdansBI
Support Johnny Harris by using his link! Use code JOHNNYHARRIS at the link below to get an exclusive 60% off an annual Incogni plan: https://incogni.com/johnnyharris
Subscribe to my other Youtube channels for even more content!
xQc Reacts: https://bit.ly/3FJk2Il
xQc Gaming: https://bit.ly/3DGwBSF
xQc Clips: https://bit.ly/3p3EFZC
Main Channel: https://bit.ly/3glPvVC
Streaming every day on Twitch and Kick!
https://twitch.tv/xqc
https://kick.com/xqc
Stay Connected with xQc:
►Twitter: https://twitter.com/xqc
►Reddit: https://www.reddit.com/r/xqcow/
►Discord: https://discord.gg/xqcow
►Instagram: https://instagram.com/xqcow1/
►Snapchat: xqcow1
If you own copyrighted material in this video and would like it removed please contact me at one of the following:
►https://twitter.com/DailyDoseofxQc
►dailydoseofxqc @gmail.com
#xQc #deepfake #ai
One of them is real and one of them is fake. I want to show you two video clips. One of them is real and one of them is fake. Can you tell which one's real? Uh, left side.
Okay, this one was kind of easy. Obviously this is the fake one. Like you can see, the color on the face isn't quite right. Yeah, the shadowing is terrible wig.
But this video was, like, from a few years ago. Let's look at a more recent example. Oh, wait a minute. Look at these two clips.
One of them is real and one of them is fake. Hah, because I saw the video, I can tell which one it is, guys. I'm eating Chick-fil-A right now.
Yeah, yeah, yeah. But also, not a lot of movement on the face is usually what they use to make things not look terrible. Okay, what about these? Left is fake. Or these? Which one of these is fake and which one is real? Oh, real left side, left side.
Maybe you're some wizard who can tell, but I couldn't, and my guess is that most people can't. That's because things have changed. You have never seen it quite like this. This technology is spreading rapidly. It's really mind-blowing.
Deepfakes. Deepfakes. The deepfake Tom Cruise was a tipping point for deepfakes. It's all real. In the last few years, we've crossed a threshold into a world where moving images are manipulated in ways that make them indistinguishable from reality.
This is our new world. It is the world of high-quality deepfakes, the catastrophic potential to public trust and to markets that could come from deepfake attacks. This is happening quicker than any of us could have imagined. So chat, am I wrong in the thought that, like, saying it is good, making laws on it is good, whatever,
but it's kind of hard to slow it down or stop it? Only time will tell. Like, there's not much you can do. The technology is out there, the knowledge is out there, people know it exists, it can be done better and easier, and they will do it. And it's like... I want to get you up to speed on what's happening with deepfakes these days: how they're shaking up entertainment and creating new challenges and fears for the people who make our laws and the people who enforce those laws, and how they can make it look like anyone is saying anything at any point in time. Ultimately I want to show you how much of a threat deepfakes actually pose and answer the biggest question of all:
what is even real? Seeing is not believing anymore. Okay, it's actually me. This is the real Johnny, I promise. This is not a deepfake, I promise. I'm not going to do some little bait and switch. I need to tell you about the sponsor of today's video, which is a thing I'm really excited about. I'm genuinely excited, not...
You can sign up and have 30 days risk-free where you can get all your money back if you feel like it's not useful to you. 30 days, use the link, thank you, Incogni. Guys, I watched this video, I'm posting the thing. Let's dive back into the wild, weird world of deepfakes. Okay, let's just be clear: we've been faking the moving image for at least 100 years. America goes to war. Thomas Edison wanted to spice up his news reporting from the front lines of the Spanish-American War back in, like, the end of the 1800s. So to do this, he shot some fake battle scenes in New Jersey and then cut them together with shots from Cuba, making it look like it was all happening in Cuba. And it wasn't. And for the next 100 years, motion picture manipulation remained as crude and simplistic as Edison's sneaky editing. But then suddenly, in 2014, with the invention of a new type of AI, everything started to change.
The incredibly realistic, so-called deepfakes. How similar it is... Is there a way to know when this is fake, to tell when it's fake? We now have the ability to make people look like other people. Like, I could be Johnny. I think it's... I could be Nick the studio manager. Look, I'm Nick the studio manager right now. I could also be Tom the music composer. And what's crazy is, even if you go back and watch what I just showed you frame by frame, you likely won't be able to tell.
Let me explain how on Earth we did this. We can do all of this because of this clever computing process. It's called generative adversarial networks, or GANs. There was a tiny bit of uncanny valley in that, right? I think if they fixed the neck and some of the lighting it would have been good, since it's a tiny bit off.
There's a little bit of uncanny valley. Two AIs work in tandem to get the best fake image possible. So you have these two AIs: one of them is a forger and the other is a detective.
The forger creates an image based on what you ask it to make, and then it shows that image to the other AI, the detective. The detective goes over the image and points out all of the reasons it's fake. It knows what to look for because it's been trained on hundreds, sometimes thousands, of images of exactly what the finished product is supposed to look like. The forger is like, okay, cool,
let me try again, and it goes away and makes another image, fixing the pixels that the detective AI pointed out. Then it shows it to the detective. The detective AI once again points out all of the weaknesses in the fake image, and the forger goes back and makes a new one, over and over and over again.
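The forger/detective setup being described here is a generative adversarial network. As a rough illustration, here is a minimal GAN training loop in PyTorch; it's a toy sketch of the idea (tiny networks, random stand-in data), not the actual face-swap model used in the video.

```python
# Minimal GAN sketch: a "forger" (generator) and a "detective" (discriminator)
# trained against each other. Toy data only; real deepfake pipelines train far
# larger models on thousands of frames of a specific face.
import torch
import torch.nn as nn

IMG = 64 * 64   # flattened image size
NOISE = 100     # size of the random input the forger starts from

forger = nn.Sequential(        # generator: noise -> fake image
    nn.Linear(NOISE, 256), nn.ReLU(),
    nn.Linear(256, IMG), nn.Tanh(),
)
detective = nn.Sequential(     # discriminator: image -> "how real does this look" score
    nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss = nn.BCELoss()
opt_f = torch.optim.Adam(forger.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(detective.parameters(), lr=2e-4)

real_faces = torch.rand(32, IMG)    # stand-in for a batch of real training images

for step in range(1000):
    # 1) The detective learns to tell real images from the forger's fakes.
    fakes = forger(torch.randn(32, NOISE)).detach()
    d_loss = loss(detective(real_faces), torch.ones(32, 1)) + \
             loss(detective(fakes), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) The forger tries again, nudging its output toward whatever fools the detective.
    fakes = forger(torch.randn(32, NOISE))
    f_loss = loss(detective(fakes), torch.ones(32, 1))   # "pretend these are real"
    opt_f.zero_grad(); f_loss.backward(); opt_f.step()
```

Each pass through that loop is one round of the creation-and-improvement cycle described next: the longer it runs, the harder the fakes are to call out.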
And so the longer you give one of these AI models to train, the more it goes through this creation and improvement cycle, over and over, until you get the... Chat, I feel like only a human can really kind of tell the human aspect of why it doesn't look human, and I think having machine learning take human feedback on why it doesn't look real, integrating that into this, could even accelerate it. No?
Am I wrong, chat? Something can look really good, and you could... there's something human about it, right? ...best deepfake possible. There are many ways to train a deepfake, but this is the most common, and it's actually how we got the deepfake that we're running on this clip right now. My face is slowly morphing into something else, right? Okay, and it's basically pixel perfect. We had to run a training model on this face for two weeks, but it got really good. Look, it's amazing. I'm not me. I mean, I am me, but I'm not me, you see the way I can put it. Chat, guys, check out the uncanny valley: it's when something is off and you don't know what it is, but you know it's off, right? It's a tiny bit off the mark, even a slight amount, and it makes it just, you know, right? I'm not me, I mean I am me, and there are things in there that give you that feeling, and that's kind of nuts.
And if you can target it, it scares you. An important point here is that we used to have to train these models on huge amounts of data. You need a lot of footage of someone's face to train these models, which is why it's mostly worked on, like, actors who have a lot of high-quality face data available. It's how you get deepfakes like this.
You like what you're seeing? Just wait till you see what's coming next. Or how you get Jim Carrey imposed onto Jack Nicholson's face in The Shining (I'm going to hurt you), or Willem Dafoe in The Silence of the Lambs instead of Anthony Hopkins as Hannibal Lecter. These last two clips were done by a super talented deepfake artist named Ctrl Shift Face. He's actually the one who's doing the deepfakes for this video too. He's the artist behind that video that went super viral a few years ago of Bill Hader doing an Arnold Schwarzenegger impression and turning into Arnold Schwarzenegger (get out of the theater, get out), and he's now working at a deepfake production company called Deep Voodoo, where he worked on that Kendrick Lamar video. Yes, that Kendrick Lamar video.
For me personally, it's exciting because I can, like, do these goofy, funny movie remixes that I can put on YouTube. I really enjoy that. And yes, when I was talking to him in our interview, he was using a live deepfake mask and changing between different characters, and it kind of blew my mind. Oh my God. Sorry, you just changed, you just changed to Mark Zuckerberg. That is just weird.
Wow. I did not... I was not ready for that. Deepfaking has already started to shape the entertainment industry. Filmmakers... Chat, what if you do your online interviews, like at some places via Zoom, and you look better, you look more happy and all that? It gives you a more compelling story on why you should be hired. ...are using deepfake technology to translate films into other languages.
It's all my fault. It's this... They can de-age celebrities. I thought we weren't going anywhere, and we're back. There's a six-part TV show in the UK about a bunch of rowdy neighbors, and guess what? All of these characters are deepfaked celebrities. Did you move some of Kim's things? Sorry. About two years ago, if someone told me something like this would exist, I don't think I would have believed him. And what will be next in, like, another two, three, five years? It's getting to the point where deepfakes are nearly impossible to decipher as computer generated, which is super exciting but also kind of scary. The FBI tells NBC News they're following the rapidly developing technology closely. I believe that this is the next wave of attacks against America. A real concern. It's a real concern.
And this is where we talk about the doom and gloom part. And how is it exciting? Because then you can have more content that normally you wouldn't be able to make, right? On the basis of that, it breaks the limitations on some productions, accelerates things, and it widens the amount of content you can make out of one piece of media or one person. So there's good and there's bad, but thinking there's no good is wrong. It expands and accelerates like crazy. And this is
where we talk about the doom and gloom part, and for that, yes, I have a lot of paper. Lawmakers and law enforcement are getting worried about this technology. Here's a letter from Congress to the Director of National Intelligence raising the alarm that hyper-realistic digital forgeries, popularly referred to as deepfakes, use sophisticated machine learning techniques to produce convincing depictions of individuals doing or saying things that they never did. I kind of love when Congress talks about technology, and this is the most important line:
By blurring the line between fact and fiction, deepfake technology could undermine public trust in recorded images and video as objective depictions of reality. Oh, and by the way, this letter (oh, now it's a problem) was from back in 2018, basically the Stone Age for AI image generation.
More recently, I saw a Democratic senator giving his opening remarks to his colleagues in Congress via a deepfaked voice that sounded perfectly like him. It's been a problem. What the... We have seen what can happen. I just spilled my coffee. But it's not just American lawmakers. Europol, which is the European international police agency, says that experts estimate that as much as 90% of online content may be synthetically generated by 2026.
90%, meaning AI will be making most of the stuff we watch. Their big fear in all of this is that deepfakes will lead to a situation where citizens no longer have a shared reality, causing what they call an information apocalypse. Here is a 43-page report from the US Department of Homeland Security, and look at this title page. I mean, look at that graphic design, especially this moment where it goes from a classic sans serif to a serif.
I mean, not bad for the government. DHS says that deepfakes and the misuse of synthetic content pose a clear, present, and evolving threat to the public across national security, law enforcement, financial, and societal domains. True. The Pentagon is using its big research wing, the one that helped invent, I don't know, GPS, the COVID vaccine, and the literal internet, that one, to look into deepfakes. Chat, just think, just think about, um, identity theft, right? You could really have somebody... you can pretend to be somebody else, right? Right, and have a data bank of their face and be like, okay, I'll do a video, or I'll do a picture with a thing that says it's me or whatever, and you can easily recreate that. ...and how to combat them. Like, they're taking this very seriously. But my big problem with all of these reports is they're couched in such vague language. What can deepfakes actually be used for outside of making really cool Kendrick Lamar music videos and TikTok Tom Cruises? Yeah, you can tell that I... Look no further than Ukraine. The Russian president says a military operation is now under way. A column of Russian armor crossing into Ukrainian territory from the north. This is not Ukrainian President Volodymyr Zelensky. Bro, this is... bro, bro, come on, man, bro.
The top of the head looks stitched on, the lighting makes no sense at all. Dude, dude, just look at the shadows and lighting. It's a dogshit video.
There's more shadow on his cheeks than there is on his neck and under his chin. This appeared on a Russian-language Ukrainian news site just four weeks after Russia invaded Ukraine. Russian troops were trying to take over Kyiv, which would have meant a massive victory for Putin. Information on the ground was scant and foggy, and then this video pops up of Volodymyr Zelensky urging his troops to surrender to Russian forces. Now listen, this is kind of a crappy version of a deepfake. I mean, the shape of the head is weird, and the accent and voice weren't super well pulled off.
But guess what? It doesn't matter, because what this video did do is make people question every other video coming out of Ukraine. And this is what Congress meant when they were freaking out in 2018 about the idea of deepfakes blurring the line between fact and fiction, undermining trust in recorded images and videos as objective depictions of reality. And this is actually, so far in my reporting on this, the biggest takeaway: I don't know if we realize what a big shift this is. As deepfake technology gets better, yes, it allows people to create compelling fake evidence, and that is worrisome,
but it also allows people to dismiss real evidence, real footage, as 'oh, that's just a deepfake.' We can do that now; we can't trust anything. Uh, it's not just... yeah,
but was it ever? You know what, chat, guys, it is almost upon us. I still don't think it's a bad thing. I still don't think it's a bad thing. Believing everything is real? Okay. Believing everything is fake? Okay. When you're stuck in the middle, in some gray area, then there's damage they could create. It's the doubt that is cast on authentic audio. I think doubting is fine. Okay, let's get to another example to get us away from all the vague language of government agencies talking about how scary this all is. Think about the legal system. Yeah, that's why I said it.
Deepfakes are becoming a nightmare for evidence in court. In 2020, there was a child custody case where the mother presented as evidence an audio recording of the child's father saying violent things over the phone. It was submitted as legit evidence, as proof, to be like: this dad is unfit to have the kid. But after some digital forensics, it became clear that the mother had used an online tutorial and some cheap software to doctor the audio file to add in fake violent words.
Most judges and juries are not ready for this. Most wouldn't think to question evidence like this, something that sounds like a smoking gun and yet was totally fake. I think that's why metadata is super important. This sort of technology is getting so accessible and easy to use. In fact, the voiceover that you're listening to right now is actually fake. I made it using cheap and easy software widely available to anyone.
Now imagine that you're on a jury and you're presiding over a criminal case, and even though there may not be any manipulated evidence in front of you, there's now a nagging voice in the back of your mind that's like, remember that time Johnny tricked me in that YouTube video and he made his face change so he looked like someone else, and it wasn't really him, and it was really convincing? Or the defense lawyers, whatever, point at the fact that it could be fake, or even hint at it, or just put the idea out there in a random question, and now it's in their heads. Who's to say all of this visual evidence that I'm seeing in court isn't also fake? Visual evidence is no longer
rock solid. Surveillance footage, body cam footage, heck, even audio taken on a bus from a presidential candidate: this was all solid at one point, and now it can all be called into question. And then, of course, deepfakes are being used for good, old-fashioned cyber crime. Man, cyber crime. It just sounds so quaint.
Cyber crime. Elon Musk was recently faked to help shill a new crypto scam: F.net will help ordinary people to gain financial independence. And yeah, this one's obviously fake and poorly done. My point is, they're getting better, and some scams are already way more persuasive. Like this group of fraudsters who were able to clone the voice of a major bank director and then use it to steal $35 million in cold, hard cash. $35 million, that's a lot of money, just by deepfaking this guy's voice and using it to make a phone call to transfer a bunch of money. And it worked. Okay,
but in reality, the risk of eroding public trust, or weakening our legal system, or giving cyber criminals a new weapon: all of these are actually the rarest uses of deepfakes. The main victims of this new technology, at least at the moment, are women. These sexually suggestive ads are popping up online. Deepfake? That's not me. By one estimate, 96% of deepfake production is used for... Classic news site. Dude, this person does not want to be sexualized in a video, and here's the intro to the video showing her... like, what is wrong with the media these days, man? ...porn, almost all of which uses women who have not consented to this.
Just two years ago, Telegram got in trouble because it was found that there were private groups on the service that were using deepfake technology to remove the clothes of more than 100,000 women. It used to just be the faces of celebrities, because they're the ones who have all this high-quality data out in the world that you can train a model on, but no longer. The tech has developed such that an ordinary person with a few images on social media could suddenly have their face deepfaked onto the body of a porn star. Martin has found dozens of videos of herself.
She has no idea who's responsible for them. Okay, but what's being done about this? What can be done? Let's first remember that when a new technology comes along, it typically evolves rapidly. Way faster than the lawmakers who need to understand it. Chat, sorry, chat.
Sorry to pause again. Chat, guys, guys. What if people get so advanced that they put out something that makes an account, that takes people's images, does that to the images, and then posts them, right? Then they create a proxy of liability. Oh well,
I didn't do it, this thing did it, right? And it posted it. I mean, it creates another layer. I mean, it's not them doing it. It's hard to regulate it, though. So we're in the kind of Wild West phase where the lawmakers are kind of just trying to get their heads around this stuff.
Let's hope that Congress can catch up. China is actually the first country to have regulated what they call deep synthesis technologies, by requiring all deepfake content to be clearly marked as having been modified. Luckily, a surveillance state has an easier time regulating this kind of stuff, but the EU is trying, in an ambitious bill. I think that was a good point. Boy, do I love going through European Commission PowerPoint slides. Like China, the EU is trying to make it so that you have to label deepfakes, but of course, in a western democracy you have to put in this line: unless necessary for the exercise of a fundamental right or freedom, or for reasons of public interest.
Oh man, those pesky rights always getting in the way of being able to regulate. The UK is also trying to figure this out. They are targeting the porn situation. The one that is the vast majority of the uses of deep fakes.
Making deepfake porn without consent comes with a penalty of prison. Chat, people have to be careful. Okay, oh yeah, this is good. This is good. This is good. But then at the same time, sometimes there's collateral damage. Yes, still, you have to play it with caution, though, because some of this stuff is legit hard to enforce. Will the government take urgent action and repair this mess? And then there are tech companies, like here on YouTube. They say that they will pull down any video
that, quote, includes deceptive uses of manipulated media, i.e. deepfakes, which may pose serious risk of harm. Who decides what serious risk of harm is? But one promising solution here would be to fight software with software. If we could train software to detect the quirks of deepfakes, like the unnatural blinking or some of the other weird, subtle things that the human eye can't pick up on, we would have a strong way of verifying what is
Chat, wouldn't this be exactly like cheats and anti-cheats, where once one wave has been found, then it's GG? Right, then it's GG, because once you've found what makes it bad and you finally punish or revert, then they know what detects it, right? It's the same thing as anti-cheats. ...real footage and what is distorted.
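As a rough sketch of what "fighting software with software" can look like, here is a tiny PyTorch binary classifier that scores face crops as real or fake. This is an assumed, generic illustration (random stand-in data, made-up labels), not any specific detector mentioned in the video.

```python
# Toy deepfake detector: a small CNN that outputs the probability a face crop
# is fake. Real detectors are trained on large labeled datasets of real and
# manipulated faces and look for subtle artifacts (blinking, blending seams).
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 128x128 -> 64x64
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 64x64 -> 32x32
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),                                       # one logit: fake vs. real
)

loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(detector.parameters(), lr=1e-4)

# Stand-in batch: 8 face crops (3x128x128); label 1 = fake, 0 = real.
faces = torch.rand(8, 3, 128, 128)
labels = torch.tensor([[1.], [0.], [1.], [0.], [1.], [0.], [1.], [0.]])

for epoch in range(10):
    logits = detector(faces)
    loss = loss_fn(logits, labels)
    opt.zero_grad(); loss.backward(); opt.step()

print(torch.sigmoid(detector(faces)))  # per-crop probability of being fake
```

As the chat analogy suggests, this is an arms race: once a detector keys on a tell like unnatural blinking, the next generation of generators gets trained until that tell disappears.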
So deepfakes are here, and they're seemingly here to stay. We are entering a new chapter of how humans consume and relate to information. It's a chapter as significant, in my mind, as some of the other big chapters, like when we learned to capture light and record it. It changed the world. Photographs could be used in court as proof. They could be used in science to capture and understand the world around us. In journalism,
it allowed us to show, not just tell. And when we made those images move, we could capture more truth, more proof, more evidence, and spread those images far. For years, evidence coursed through the world's internet. This evolution of the moving image and its uses in the connected world is taking an unexpected turn, away from being the bedrock of evidence and towards the foggy territory of deception and confusion. We're going to have... You can say something isn't fake.
You can also say that it is fake, right? What if there was something real? You can say, well, it was fake. It was fake, but it wasn't.
To navigate this chapter carefully, we're going to have to push even harder to know where our images came from, who made them, and for what purpose, and we're going to have to resist the urge to believe everything we see, no matter how real it looks. I'm watching. Well, I'm watching, I'm not listening. It's just, at the end,
it was just words. Chat, that was interesting. It is July 17th, 2023, and you are watching The Code Report. The internet has a big problem. When you look at an image,
how do you know if it's real? This image of the Pope tricked millions of grandmas worldwide. When you hear someone's voice, how do you know it's real? My real voice? It's a professional clone from ElevenLabs. My biggest fear as a digital creator is that tools like this will be used to fully replicate my online swagger and make me irrelevant. It's already happening to people. Like, I legit thought that Greta launched her own oil company recently: hello, my name is Greta, welcome to my company, I love how it is pumped out of the ground, I'm a huge fan... as you can see here in this clip that borrowed my deepfake animation to make it look like I'm not AI generated. Luckily though, there's a new coalition in town that includes friendly mega-corporations like Adobe and Microsoft, who have stepped up to rescue us from these generative AI problems that they created. It's called the Coalition for Content Provenance and Authenticity, or C2PA. Not many people know about it yet, but it should be on every developer's radar because it could change the internet as we know it.
Basically, it's a spec, a set of guidelines for both hardware and software providers, that would attach metadata to every media file (images, videos, audio, and so on), then use cryptography to digitally sign it, making every file tamper-aware. The idea is to make it impossible to change a pixel without the provenance of those changes being recorded on the file itself, or to a manifest that is permanently attached to that file. The first record will happen on the camera where the photo is taken. Then when you go to edit it in Photoshop, another record will be logged and signed there.
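To make the hashing-and-signing idea concrete, here is a deliberately simplified Python sketch. It is not the real C2PA manifest format or SDK; the file name and field names are hypothetical, and a real implementation would use certificates issued to hardware and software vendors rather than a locally generated key.

```python
# Simplified provenance illustration: fingerprint the media bytes, record each
# edit in a manifest, and sign the manifest so tampering is detectable.
# NOT the actual C2PA format; just the general idea.
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric import ed25519

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The capture device or editing app would hold the private key; verifiers get the public key.
key = ed25519.Ed25519PrivateKey.generate()
public_key = key.public_key()

image_bytes = b"\x89PNG..."    # stand-in for the raw bytes of a captured photo (hypothetical)

manifest = {
    "asset_hash": sha256(image_bytes),                 # fingerprint of the pixels
    "history": [
        {"tool": "CameraFirmware 1.0", "action": "captured"},
        {"tool": "Photoshop", "action": "cropped"},    # each edit appends a record
    ],
}
payload = json.dumps(manifest, sort_keys=True).encode()
signature = key.sign(payload)                          # shipped alongside the file

# Verification: recompute the hash and check the signature.
assert sha256(image_bytes) == manifest["asset_hash"], "pixels were changed"
public_key.verify(signature, payload)                  # raises InvalidSignature if tampered
```

Something like that last verify step is what the "inspect the provenance" icon mentioned next would rely on.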
It's almost like every image will become an NFT, minus the blockchain. You'll be able to click on this icon to inspect the provenance information to determine if that image was generated by AI, or to determine if it comes from a trusted news source like BuzzFeed or Infowars. This technology is already here, and just last week the Coalition was urging the US Senate to put laws on the books that would make this technology essential: we propose Congress establish a new federal anti-impersonation right that would give artists a right to enforce against someone intentionally attempting to impersonate their style or likeness. That actually sounds great. As a creator, I don't want to be impersonated, and as an end user, I want to make sure that I'm consuming authentic content. And it would be very useful to companies like Valve, who have recently started banning games on Steam that are suspected of using AI-generated... I mean, people have been stealing and copying styles since the beginning of art, yo, like, it's just how it is... content that could potentially violate someone's copyright. And other top companies like Stability AI are on board, saying their platform can be digitally stamped with metadata and watermarks, and welcoming Adobe's leadership in driving the development of some of these
open standards. And not surprisingly, the US Department of Defense is on board, because they believe this technology can help surface bad actors like Steven Seagal or anyone else creating horrible synthetic content. Sounds pretty awesome, and I love when big corporations team up with the government to keep me safe, but when looked at from another angle, this could be viewed as a mass surveillance apparatus. In the future, it may be impossible to change a pixel on the internet without leaving a digital footprint. And the spec talks about how this technology could be used with digital IDs issued by the government, which will make it far easier to figure out who's creating all these memes that are offensive to our dear leaders. When this technology is combined with a digital currency and social credit system, we could easily shut down the meme warriors' internet access and reduce their allowance of lab-grown meat to just 12 ounces per week. In addition, it would give the establishment a monopoly on disinformation. They could hypothetically create all the AI-generated content they want while making it look trustworthy, and the vast majority of people out there will believe whatever authorities tell them. Like, if this image had a NASA provenance signature on it, almost everybody would believe that we went to Mars, even though it's not a real place you can go to. In 1981, the CIA director said we'll know our disinformation program is complete when everything the American public believes is false.
This has been The Code Report. Thanks for watching, and I will see you in the next one. Much love, skrrt skrrt. It is June... Chat, that was interesting. Verium looks good.
No man, this deepfake is mad scary, yo. This is xQc on the beat... okay, my voice as well? That is... anyone knows that boy? I don't know, he's just so... anyone knows that boy? I don't know, he's just so sor...
👍
just deep fake so deep that it starts to become deep real
so much misinfo in those 2 videos….
Xqc deep fakes his hair.
bro gave us 3 seconds to analyze 2 videos at the same time to impress us over this worn out topic… NAILS!
pixels stay pixels
Easy to solve if you understand blockchain or any other technology: encode videos using a specific sequence and play back using only that sequence; any edited videos would throw up an error. This is both good and bad, because it stops deepfakes, but governments use the same techniques to lie to the public, so all news media would have to be unedited and live. No more 100 shoots until you get the best one.
Who needs Deep Fakes when you have xQc cloning capabilities
wait i watched that police video like 2 days ago or something its fake video? now i dont trust any video upload anymore
It's easy, every country should put a law where if u make deepfakes cops can use lethal force at any time and all of a sudden most people will stop.
Yo can chat have a separate channel and remove xqc from the vid?
God xqc always stating his unneeded opinion