Hey there, fellow entrepreneurs! I’m Terry Brock, your host on Stark Raving Entrepreneurs, and today, we’re venturing into some very serious territory—the world of Deepfakes. This isn’t the stuff of science fiction anymore; it’s real, and it’s happening now. Artificial intelligence is no longer just about optimizing our workflows or crunching data. It’s evolving to a point where it can mimic us in ways we never thought possible.
This is slightly different from what we usually focus on at Stark Raving Entrepreneurs. Normally we deal with marketing and content creation. Those areas are vitally important, and we will continue to address them.
At the same time, we’re hearing deep concern from many of our clients about AI and deepfakes. So I reached out to a friend who happens to be one of the top audio and video forensic experts in the world for his advice.
In this episode, I sit down with Mike Primeau, a top expert in audio and video forensics from Primeau Forensics. We talk about the chilling capabilities of AI to replicate human voices and likenesses so convincingly that it can fool our own family members, let alone a computer algorithm. Imagine getting a phone call from a loved one who is actually safe at home, only to believe they’ve been kidnapped, just because an AI was clever enough to deceive you. That’s the kind of threat we’re facing.
Mike dives deep with me into the forensic side of things, showing us just where technology is at today with detection and verification, and revealing the realities law enforcement and the legal system are grappling with now. It’s not all doom and gloom—there’s promise on the horizon—but we’ve got to be smart, savvy, and skeptical in ways we’ve never had to before.
So, entrepreneurs, tune in as we tackle the tough questions, the potential dangers, and yes, the necessary safeguards that come with AI’s dark side. Our businesses, our safety, and even international security might just depend on it. Let’s get started.
First, here’s a gift for you to better understand AI and build sales – Recommended AI tools for business growth.
http://AItools4biz.com
Here’s Your Video Link:
https://youtu.be/_UBZw5IxqLE
Listen to this & other episodes on our podcast
https://bit.ly/sre_podcast
ABOUT
We’re Stark Raving Entrepreneurs. We help you with the lifestyle of “Live and Let Live.” Do whatever you want, but don’t hurt others and don’t take their stuff. We embrace what is called the “Non-Aggression Principle” (NAP).
TERRY
Hall of Fame keynote speaker Terry Brock is a globally connected leading authority who works with organizations that want to leverage technology and social media for more customer engagement, productivity, and increased profitability.
Terry is the former Chief Enterprise Blogger for Skype, former Editor-in-Chief for AT&T’s Networking Exchange blog, and former Chief Retail Advisor for ACE Hardware.
A master at his craft, Terry earned the Certified Speaking Professional designation from the National Speakers Association and was inducted into the Speaker Hall of Fame. He was also inducted into the Virtual Speakers Hall of Fame and recognized as a Legend by the Veteran Speakers Association.
Terry is also a recipient of the Cavett Award, recognized as the National Speakers Association’s “most cherished” award (only one is given per year).
GINA
As the CEO of Stark Raving Entrepreneurs, Gina Carr works with business leaders to leverage AI-powered marketing for more impact, influence, and income. Gina has an MBA from the Harvard Business School and an engineering degree from Georgia Tech. Known as “The Tribe Builder,” Gina helps passionate people build powerful tribes of raving fans.
A serial entrepreneur, Gina has created several businesses, including an award-winning real estate company, a chain of community magazines, and Video Rock Starz. She is the CEO of TEDxDupreePark. A native Atlantan, Gina now lives in Orlando with her sweetie Terry Brock. Gina is a passionate advocate for animals, freedom, and plant-based living!
Here are the ways to work with us here at Stark Raving Entrepreneurs:
===========================
ARE YOU A CONTENT CREATOR? Get this popular guide: https://terrybrock.com/5k
Join our Stark Raving Entrepreneurs Community TODAY at: http://starkravingentrepreneurs.com
===========================
MISSION
We’re a channel devoted to those with an entrepreneurial spirit who believe in freedom and liberty. If you like the idea of living a voluntary life, not initiating force or coercion, and living life peacefully, abundantly, and making a lot of money, this is the place for you!
We look forward to hearing from you and getting your opinions and thoughts. Drop us a note at Terry@TerryBrock.com or Gina@GinaCarr.com.
Thank you for joining us today.
Terry Brock & Gina Carr
===========================
Schedule a Strategy Session with Terry
Schedule a free 20-minute, no-obligation Zoom call with Terry to look at your marketing and technology.
http://20WithTerry.com
Schedule a Strategy Session with Gina
http://bit.ly/30minswithgina
Here are some timestamps for this episode that you might find helpful:
00:00 Computer systems need human oversight for accountability.
03:28 Progress in manipulation detection; challenges with AI.
07:38 Legal community AI research, development, and disputes.
10:41 Law enforcement relies on phone data for investigations.
13:41 Cooperation between technology companies and law enforcement.
17:09 An intriguing conversation about the Cuban missile crisis.
21:49 Admissibility of courtroom evidence based on forensic science.
22:46 Authenticating digital images involves understanding technology.
For your convenience, here is a summary of the episode:
In this episode of the “Stark Raving Entrepreneurs” podcast, “AI Deepfakes: What You Need To Know,” host Terry Brock and guest Mike Primeau discuss the impact of artificial intelligence (AI) on the world of audio and video, as well as the benefits and dangers of deepfakes.
Terry Brock introduces the topic by highlighting how AI can clone voices and enhance video quickly, but he also mentions potential risks such as the manipulation of the truth. Mike Primeau, an audio and video forensics expert, joins the episode and explains how the technology can be used for good in the legal community as well as for malicious ends, including manipulations that can be misleading at trial.
Primeau also describes how AI helps in searching for suspects or vehicles and in protecting identities, and he highlights the challenges of authenticating audio and video as AI keeps improving. The episode continues with a discussion of the responsible use of AI, and Primeau argues that it is crucial to understand how AI tools work for evidence based on them to be admissible in a court of law. The episode also explores hypothetical national security scenarios and personal experiences from Terry Brock that illustrate the importance of verification and communication in emergencies.
Finally, Terry and Mike talk about how AI tools such as Claude 3 can help identify AI-generated images, although Primeau emphasizes that forensic procedure demands a rigor that ensures the accuracy and reliability of the analysis. Mike shares contact information for anyone interested in digital forensics and recommends the SWGDE website as a resource for learning more about best practices in the field.
Also for your convenience, here is a transcript of the conversation:
Terry Brock [00:00:00]:
The field of AI is changing our world profoundly. We’re finding many, many ways that it’s helping us in marketing and business. It makes things easier and does a lot of good there. It also has a lot of ramifications in audio and video. There’s a lot of positive that can come from taking your voice, being able to clone that, and being able to say more of what you want. We, as creators, often will wanna do that. And video, to create videos even better and faster. Hey, there’s some good with that. There’s also a flip side.
Terry Brock [00:00:28]:
The flip side is there’s some dangers, and we’ve got some problems. So what we’re doing today is we’re gonna give you a chance to find out from an expert what’s going on. Joining me right now from his offices in the Detroit area is Mike Primeau, who is a forensics expert working with audio and video at Primeau Forensics. Mike, good to have you with us today.
Mike Primeau [00:00:48]:
It’s an honor to be here, Terry. Thank you for having me.
Terry Brock [00:00:50]:
Well, you see a lot that’s going on. I know you’ve been working with this at your company for many years, listening to an audio recording and asking, is that real or is it not? Or that video, what really happened there? And sometimes you work with law enforcement, and you’re working in trials to make sure, okay, is this real or not? Tell us a little bit about where we are, just from the big-picture point of view, with audio and video detection and use, and when we can go in from a forensics point of view and decide, is this real or not. Where are we today with the technology?
Mike Primeau [00:01:22]:
Yeah. Yeah. That’s a good question. So 30,000 foot view is that artificial intelligence models can be used for, useful components and investigative means within the legal community, as well as being used as a weapon, you know, to to demonstrate things that aren’t necessarily the way that they originally occurred. And the the major concern from a scientific community perspective, which in the last couple of years I’ve become, more and more involved with, with a group called SWGDE, which is the Scientific Working Group on Digital Evidence. They generate standards and best practices for law enforcement, in the in the space of digital forensics. So audio, video, image, as well as digital forensics, so cell phones and computers and that kind of stuff, is that we we need to understand the methodology. We need to understand the intricacies of how the tools work in order for be it to be admissible in a court of law because the legal world is still and and has been, human centric.
Mike Primeau [00:02:25]:
So when you have computer systems that are making determinations for humans, a human needs to be able to testify about those components, about what’s actually happening underneath the layers. And, again, that can be useful with good intentions and in good ways, to do things like video content analytics. So SWGDE actually has a standards document, I can send you that, that references some of the ways that law enforcement can use artificial intelligence for good, like scanning video recordings to find a specific suspect or a vehicle, or performing video redactions, so automatically finding areas within a video recording to mask, as well as, audio-wise, protecting the identity of victims and protecting the integrity of the investigation, things like that. And then we flip to the bad side, which is manipulation. You know? And that’s a strong word that we use within the scientific community that has a definition that essentially means open to erroneous interpretation. So making changes that can be misleading to a trier of fact, a judge or a jury.
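For readers who want to see the redaction idea Mike describes in concrete terms, here is a minimal Python sketch that uses OpenCV’s stock face detector to blur faces in a video. It is only an illustration of the concept, not the tooling Primeau Forensics or law enforcement actually uses, and the file names are invented.

# A toy version of automated redaction: detect faces in each frame and
# blur them to mask identity. Not forensic-grade tooling.
import cv2

cap = cv2.VideoCapture("interview.mp4")          # hypothetical input file
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
out = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if out is None:
        h, w = frame.shape[:2]
        out = cv2.VideoWriter("interview_redacted.mp4", fourcc, 30.0, (w, h))
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, fw, fh) in face_detector.detectMultiScale(gray, 1.1, 5):
        face = frame[y:y + fh, x:x + fw]
        frame[y:y + fh, x:x + fw] = cv2.GaussianBlur(face, (51, 51), 0)
    out.write(frame)

cap.release()
if out is not None:
    out.release()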
Mike Primeau [00:03:28]:
So with respect to manipulation, we have made significant strides at being able to detect those things with audio, but in recent years it’s become increasingly difficult because of how well AI is doing at generating voices and things like that. With video and imagery, much more significant strides, because there’s more of it. That’s really what it comes down to. There’s more surveillance and more video recordings on sites like YouTube and things like that. But we’ve also seen a surge, and I won’t go too far down this road because it’s a sensitive topic, in how these affect crimes against children, specifically with images and video, which has become an increasing concern within the scientific community. But the tools are doing very well at being able to detect those things. With respect to the audio portion, there is a distinct unknown variable that still exists in being able to identify whether or not that’s someone’s voice. So voice comparison is actually a process that we’ve backed away from a little bit for the time being, until more research and development has been done into these tools from the private sector.
Mike Primeau [00:04:36]:
So law enforcement still has access to a lot of the tools that were used for voice comparison, but, unfortunately, we don’t have access to those tools yet. Again, to make identifications, which in our world means an identification to the exclusion of all others. Right? So if you’re gonna get to
Terry Brock [00:04:49]:
Right.
Mike Primeau [00:04:50]:
100% certainty on something, it has to be based on a test that you provided that eliminates all other possibilities. So with voice comparison, yeah.
Terry Brock [00:04:59]:
Go ahead. I was just gonna say, it would seem like we can have AI come up with the right criteria to say, yes, we can say this is Bob’s voice. We’ve checked Bob’s voice against other legitimate recordings. We’re hearing this new recording. Yes, I can say that is Bob, or at least we have a 99.999% chance. Is that correct?
Mike Primeau [00:05:19]:
AI definitely could be useful in that regard. Yeah. To be able to complete those models, because from a statistical probability perspective, if we get to say that it’s, you know, 95% more likely, the strength of that opinion is based on a known dataset, and because that dataset is so large, theoretically, yes, AI would be very beneficial to be able to examine those things and go, yeah, I’ve compared these components that you’ve fed me as the criteria to arrive at an opinion, and this is where we got. Now there’s a new, pretty useful tool that we’ve been testing out. It’s fairly new, from one of our vendors called AMP, which is a license plate identification tool. So it uses
Terry Brock [00:05:59]:
AMP?
Mike Primeau [00:06:01]:
I’m sorry?
Terry Brock [00:06:02]:
Is that AMP, A-M-P?
Mike Primeau [00:06:04]:
AMPED. So in Italy, they refer to it as AMPed, but in the United States, we call it AMPed. Okay. But it’s the leading image and video analysis tool for forensic use, and they’ve recently come up with something called DeepPlate, which uses an artificial intelligence algorithm to examine pictures that we give it and arrive at a statistical probability of what it thinks the characters are, those numbers and letters, which is a handy little tool that law enforcement’s been using more. So it is something that we have our hands around as a scientific community, but we’re a little slower to the uptake with how advanced AI has become in the last couple years with respect to audio.
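To make the “statistical probability against a known dataset” idea concrete, here is a minimal sketch of scoring an unknown sample against known reference samples once everything has been reduced to numeric feature vectors. The vectors, names, and threshold-free interpretation are invented for illustration; this is not how DeepPlate or any forensic voice comparison tool works internally.

# Toy comparison: cosine similarity between an unknown sample's feature
# vector and known reference vectors. Real systems derive these vectors
# from audio or image models and calibrate scores against large datasets.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

known_bob = [np.array([0.90, 0.10, 0.30]),    # hypothetical reference samples
             np.array([0.85, 0.15, 0.35])]
unknown = np.array([0.88, 0.12, 0.33])        # the questioned recording

scores = [cosine_similarity(unknown, ref) for ref in known_bob]
print(f"mean similarity to known samples: {np.mean(scores):.3f}")
# An examiner would weigh such a score against a population of non-matching
# samples to express strength of evidence, not claim identification
# "to the exclusion of all others."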
Terry Brock [00:06:45]:
Yeah. On one hand, that concerns me, because I think, hey, this is something that’s moving fast. You know far better than me how fast it is moving. We see it from the marketing and the commerce side. Literally every day there’s something new that happens in the AI space, and I’ve always been concerned. I’ve often said something in courses that I teach, and I wanna get your professional opinion on it. I said one of the best things we can do is not to stop AI, because we could say, let’s pass a law and not do anything ourselves.
Terry Brock [00:07:14]:
We’re gonna have a really hard time persuading the Russians and the Chinese and the Venezuelans and the North Koreans and the Iranians to stop all their AI activity. And I don’t think we’re gonna be doing the right thing if we just hold it. But I think what we need to do is to put the right guardrails in and then let AI check AI to make sure that it is being correct. What are your thoughts on that?
Mike Primeau [00:07:38]:
Yeah. No. It’s absolutely accurate with respect to the legal community, which is where I live, you know, most of the time. I know that you’ve done a lot more of that research and testing on the on the marketing and the technical side. And there’s a lot there’s just so much that we’re we’re talking about just the analysis of material, but there’s a lot of document, you know, the the the copy generation that AI is using. But I also read an interesting study to that point, yesterday about how 2 major companies, I don’t remember which ones are fighting over litigation, about how their artificial intelligence model used the other company’s content to develop itself. So they’re saying, hey. You know, you owe us for using all of our blogs and all of our content on our websites to generate your AI model that’s now doing x, y, and z.
Mike Primeau [00:08:23]:
So it’s it’s interesting to see how it’s it’s getting into that litigation component where they’re actually trying to set up those guardrails to say even how the AI learns needs to be, policed in some way, shape, or form. But, yeah, absolutely. There there has to be something that says, you know, whatever the the tool is doing, the the this is the means. The and and the expert testimony courses that we take, it’s called staying in your lane. So we know as an expert, you know, we don’t try to like, me as a video analyst, I’m not gonna testify about an accident as an accident reconstructionist. You know, I’m not trained to do that. That’s not my qualifications. But if there’s a dash cam involved, that’s where my expertise, you know, applies to that legal scenario.
Mike Primeau [00:09:04]:
So, yeah, similar concept. There need to be boundaries. Absolutely.
Terry Brock [00:09:08]:
So that’s something that we’re all concerned about. We say, okay, it could be fake or not. Matter of fact, we’re familiar with this. I’m sure you’re familiar with the story of, mother and father who got a call from the voice that sounded like their daughter saying, I’ve been kidnapped. They need x number of large money. Please send it right now. What are we going to do? And, they were very concerned because it sounded like her. This is her mother and father listening and said yes, this is our daughter and she needs it She’s been kidnapped and then something happened.
Terry Brock [00:09:37]:
I forget what it was, but they found out she was actually in her bedroom or something, but there were bad guys doing this, so they didn’t send the money, and they got out okay. But that kind of scenario seems like it has happened, and I’m sure there are many other cases where it has. What can we do to understand how to react to this, outside of being scared to death? What would you recommend as next steps?
Mike Primeau [00:09:58]:
Pretty scary. Yeah. No. As an investigator, I would say that’s a perfect example of how to actually authenticate that scenario. Right? So with respect to analysis of the media for authenticity purposes, we have authentication best practices where we examine source structural information that says, hey, does that file look like it came from where it allegedly came from? So with respect to videos, you know, if something surfaces on YouTube, we obviously know that its source came from some sort of camera. So we have to be able to see if that source information is contained within that file, or traces that that camera technology left behind when it was creating that stream. For calls, that can be increasingly difficult because of how call routing can be done.
Mike Primeau [00:10:41]:
We don’t know exactly where that call is coming from, in which case law enforcement may be limited in their efforts to be able to, you know, respond to that in an appropriate manner. So it’s a great question, but a very thorough investigation of, well, let’s examine her phone. Let’s figure out is, you know, is her last known location according to the cloud showing something that’s reliable and, you those kinds of things. So because of the abundance of information that are contained in phones, those can be very useful if we have access to the phone or access to the cloud. What we’re seeing a lot more of now with we have a partner company in Troy that we work with here locally, that collects and does our cell phone and and computer forensic analysis that we work closely with. A lot of cases that we’re examining where we’re collecting the devices, they’re actually saying, hey. This isn’t good enough. We need to go to the cloud.
Mike Primeau [00:11:26]:
And it’s it’s a it’s a godsend in a lot of cases that we have that that cloud information that we can extract for that user because there’s more hiding there than there is in the actual device in a lot of situations. Yeah. So that becomes increasingly useful to go to those sources. So although, you know, in litigation where we’re talking about, let’s say, a criminal case and we need to analyze an audio recording or a civil case that’s got 100 of 1,000,000 of dollars at stake because of robocalling or something to that effect, authentication best practices really kick in to see if we can determine, does this look like something that came from that type of device, or does it look like something that uses some sort of software to create that file?
Terry Brock [00:12:05]:
Yep. I think that’ll make the difference. Well, let me hit you with something that I’ve, con concocted on my own. I often talk about in classes and in courses where I’m speaking. Sometimes people are a little bit concerned. Let’s suppose, for instance, Mike, you are now president of the United States. Wonderful guy like that you are. We made you president of the United States.
Terry Brock [00:12:21]:
You’re in your Oval Office. You’re working on some matters. Deep thought on that. Suddenly, the door bursts open. In walks about 5 to 10 of your closest, most trusted aids. They’re saying, mister president, turn on your phone right now. Putin is on there saying he’s about ready to launch missiles against the United States. It looks like him.
Terry Brock [00:12:41]:
It sounds like him. What do we do, Mister President? You know that within 15 minutes, if he really does launch those missiles against, let’s say, New York, LA, oh, and Detroit, just because you’re from Detroit, then in 15 minutes it could be over. Or if it’s not him, and you launch first to take out Moscow, Saint Petersburg, or whatever else, then you’ve been the one that initiated it, and they’re gonna see the missiles coming. What do you do in that case, Mister President, when you get that report?
Mike Primeau [00:13:12]:
That is a complex... yeah. That’s a unique situation.
Terry Brock [00:13:17]:
And unfortunately, today, that’s not just hypothetical. Ten years ago, I would have said, yeah, but that’s probably not gonna happen. Today, when we have Putin saying we’re not scared of using nuclear weapons, we could do this if you push us too far... I know it’s scary, but I think we need to kinda go there, at least examine it a little bit and think about it. What are your thoughts on that?
Mike Primeau [00:13:41]:
So the first thing that comes to mind is, you know, I apply it to law enforcement. I’m not a law enforcement officer, but I train with and work closely with law enforcement these days with respect to forensics, because they are kind of on the bleeding edge of that, because of how much they have to deal with, you know, a lot of the multimedia that’s presented as evidence. And one of the things I really like is that companies like Ring have a portal specifically for law enforcement. So if they have an issue where, let’s say, they’re canvassing a neighborhood for a homicide that occurred and they notice there’s 10 Rings throughout that street, they can then go to the Ring portal, under order of a search warrant that’s signed by a judge, to collect all of that data from the server. So it’s like a cooperation that Ring has with law enforcement specifically. I feel like there would have to be some sort of cooperation that the presidential office has with the resources that they’re collecting that from. Let’s say they found it on YouTube. It would be ideal, in a perfect world, that, you know, I would have the ability to contact Google and say, hey, we have a relationship.
Mike Primeau [00:14:44]:
I need to see that file and send it to my advisors so they can hear whether it’s something that Putin actually said or not.
Terry Brock [00:14:51]:
And this could be real time. It could be we’re seeing it live, a YouTube Live going out.
Mike Primeau [00:14:56]:
Yeah. And if it’s a live stream, yeah, definitely. Then YouTube, you know, if there’s something that they can, offer with respect to where those servers are communicating that information and, you know, cybersecurity professionals were involved to help, you know, kinda understand and verify those things. I I think it really comes down to the cooperation that Mhmm. We have with these resources. So take our AI for example. You know, having these communications with companies that own these tools and saying, hey, you know, we have a serious concern here for what’s going on. We need kind of a backdoor from a from a government or a law enforcement perspective to make sure that, you know, we can act accordingly in the timelines that we need to act to investigate.
Terry Brock [00:15:35]:
Yep. And it would seem like that. And cooperation, I would say, would also need to exist between the various governments, to say, hey, wait a minute. I know that, for instance, they have, I think it’s a red phone. I’ve not been there, I don’t know. But a way that the president of the United States could pick up the phone, get directly through to the leader of Russia or China or wherever and say, hey.
Terry Brock [00:15:54]:
What’s going on? Is this real or not? Let us know and use the code that we know after that. Like, they would have some kind of protocols that they would go through on both sides because really, I don’t think any sane person on the planet wants to destroy all life on the planet.
Mike Primeau [00:16:09]:
Definitely. Yeah. And that’s a really good point to verify, like, you know, hey, we got this video that’s saying it’s you. We’re concerned it might be a, b, c, or d country that’s, you know, trying to spark something between us. And, you know, before we both spend our time and resources on this, we just wanna verify it, you know. And there should be that cooperation, definitely. There should.
Terry Brock [00:16:27]:
Yeah. I remember, a few years ago, several years ago, I had the opportunity... I was over in Moscow. I speak a very little Russian. I studied some of that back in school, and I had a chance to meet some people, got a chance to know them. They invited me over for a very big dinner in their home, just the family. And it was interesting. There was a knock at the door. I wondered, who is that?
Terry Brock [00:16:47]:
After we were there about 10, 15 minutes or so, in walked a gentleman. They were speaking in Russian to each other. He came over to me, and he said in perfect English, with an American accent, you must be Terry. I’m like, what is this? What’s going on? He introduced himself as the brother of the man hosting us. Come to find out, he was a KGB general. And I
Mike Primeau [00:17:08]:
think we
Terry Brock [00:17:09]:
come again. We had a chance to get to know each other, and I thought, this is really interesting, talking to each other. But the reason I mention this is he was talking about when they were in the Cuban Missile Crisis. He happened to be right in Washington at the Soviet embassy during that time. And when he told me, I said, well, I’m concerned, because for us, that was a big deal. What was it that caused it to stop? What would stop it? He said, well, it was when your president sent his brother to our embassy. I said, what? I’ve never heard that. What do you mean? Yes.
Terry Brock [00:17:39]:
When he sent Bobby Kennedy to the Soviet embassy, that meant he was on Soviet soil. We could have done whatever. And they told Khrushchev, who was the premier at that time, the Americans really don’t want to do this; they’re gonna do this and this. And come to find out, that actually was classified information, and eventually it was declassified. That Soviet KGB general was telling me the truth. And so I say that to say we had communication back and forth during that Cuban Missile Crisis. We could have gone into nuclear war. But fortunately, they realized, okay, we need to have this communication.
Terry Brock [00:18:11]:
We need to stay in touch. So it sounds to me like we’ve got some good opportunities, and what you’re saying really is good. I’ve got one other thing I wanna share with you and with our viewers and our listeners right now, something that I think you’ll find interesting. Earlier today, I was doing some fiddling around with Perplexity and with Claude. Perplexity is a tool that’s great. I just yesterday went into the paid version on that, and I wanna show you a little scenario that I’ve got. Okay.
Terry Brock [00:18:36]:
I’m looking at a prompt that I put in here just a little while ago. And, Mike, here’s the prompt that I put in. Now this is going to Claude 3, which by many are saying it’s ahead of chat GPT. It’s ahead of, complexity. Really, really good, really accurate. And I said, is this picture of 2 human beings or is it AI generated? How can you tell either way? Should’ve put a question mark there. What are the traits and signals that let you know what this picture really is? And so I’m gonna click on this right here. And is that picture coming through for you, Mike?
Mike Primeau [00:19:07]:
Yeah. Yeah. I can see it.
Terry Brock [00:19:08]:
If I told you that’s, Mike Brown and that’s Sally Smith, they live in Warren, Michigan, not too far from you there in the Detroit area. And, they wanna meet with you next Thursday at 3. That’s what they look like. That would seem like those are real people. Right?
Mike Primeau [00:19:24]:
Yeah. Absolutely.
Terry Brock [00:19:24]:
Yeah. And it seems like it. So here, I’m asking the service, let’s say, okay, hey, Claude 3, tell me, what do you think? Well, here’s the answer that it gave me. It said the image appears to depict two real human beings, not an AI-generated image.
Terry Brock [00:19:38]:
There are several traits and signals that suggest it is a photograph of actual people. And so it went on and said, oh, the level of detail and realism in the facial features, clothing textures, and background environment is extremely high and naturalistic. The lighting, shadows, and reflections on the people’s faces and clothing look physically accurate, the body language is natural, and there are no obvious artifacts. In other words, plenty of reason to say, yeah, these are real people, and, Mike, you can go meet them next Thursday in Warren, Michigan, etcetera, etcetera. Well, I came back and I said, well, this is actually AI generated and is not real human beings. If you know it is not real human beings now, what could have led you to believe, inaccurately, that it is a picture of two real humans? I thought, this is where we’re getting into the nitty gritty. And this is where Claude 3 comes in, because it goes several steps beyond what we’re seeing in other tools.
Terry Brock [00:20:31]:
It said, you’re correct. Upon further investigation and your clarification that this is an AI generated image rather than a photograph of real people, I can now identify certain subtle traits that could have misled my initial assessment. Number 1, the facial features, while highly detailed, have a slightly smooth or idealized quality that is sometimes seen in AI generated faces. Okay. That’s good to know. And it’s kinda like, okay, it’s feeding this information. It can learn it. The lighting and shadows, though realistic overall, have a somewhat flat or unevenly or evenly distributed quality across the scene that can be a hallmark of AI image generation.
Terry Brock [00:21:11]:
And it goes on with some others. And my thought is, okay, the information that it got the second time, from my follow-up question, seems like the level we need to get to first, to build that in there, so that the next time we ask, is this the real person, or in my hypothetical scenario that we hope never happens, is this really Putin saying I’m gonna, you know, immediately bomb, then it can go in and say, wait a minute, this is not Vladimir Putin, because the voice is different, or the facial characteristics are a little different, or the lighting is different, or something like that. It would seem like those would be some of the ways that we can build in a little more safety, but you’re the expert on this. What are your thoughts on that?
Mike Primeau [00:21:49]:
So in order for something to be courtroom admissible, we have to be able to ensure that it’s based on a novel forensic science. So, you know, what is the methodology that that person is going to be talking about? And it was actually interesting timing. I received a ruling from a Frye hearing that was done on another case yesterday, where an AI enhancement tool was used by a video professional. They claimed to be a forensic video professional, but it was later ruled that they were more like a videographer, as opposed to someone that’s forensically qualified and trained, who was using a specific tool that was an AI enhancement methodology. And the court ruled that it was inadmissible and can’t be used, you know, the exhibits that they were offering and things like that, because at its core, we can’t account for whether what it’s doing was accurately done. So we can enhance something. We can make it bigger. We can make it prettier.
Mike Primeau [00:22:46]:
That doesn’t necessarily mean that the process that was, let’s say, enlarging that picture was accurately taking those pixels and introducing those things into the recording in an accurate, reliable way. So with respect to, like, authentication, for example, in that model, you know, the chatbot doesn’t really say what it’s doing. It’s just saying it’s visually looking at the photograph. But there are specific tests that are considered best practices to examine and go, yes, you know, we’ve examined this thing and this is the result. So it it almost looks like it’s also doing it on a global level without an understanding of the technology. So a big component of the forensic process is to understand how that technology works. So when you’re producing something like a JPEG picture, that JPEG algorithm, the way that it’s creating all of those pixels and and packaging that file has certain clues it leaves behind to be able to verify that it’s working the way that you know, in an authentic manner when it’s capturing that photograph as opposed to if it’s being used to do something in Photoshop. And I’m not I’m not saying that Photoshop is a bad tool to use and it’s only used for bad ways.
Mike Primeau [00:23:50]:
We use it in forensics as well, within the appropriate means, you know, considered best practices. So, for starters, we would have to know exactly what those algorithms from the chatbot are doing. What is it examining? Is it just looking through the picture and saying, yeah, this looks like people? You know, is it comparing that with a known database of people? Or is it just assuming, based on what you’ve fed it, that those are people? And then it goes back and looks at it again and goes, oh, yeah, I see what you’re saying, Terry. That makes sense. So it doesn’t appear to be very thorough, as opposed to, you know, understanding: hey, these are the tests I ran, Terry. These are the results.
Mike Primeau [00:24:29]:
You make the decision.
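One widely known example of the kind of repeatable, explainable test Mike contrasts with the chatbot’s purely visual judgment is error level analysis, which resaves a JPEG at a known quality and looks at where the recompression differences concentrate. The sketch below, with invented file names, only illustrates the concept; on its own it is not a validated forensic examination.

# Minimal error level analysis (ELA) sketch: resave a JPEG and amplify the
# per-pixel difference. Regions edited after the original compression often
# stand out. Illustrative only.
from PIL import Image, ImageChops

original = Image.open("questioned_photo.jpg").convert("RGB")   # hypothetical file
original.save("resaved.jpg", "JPEG", quality=90)
resaved = Image.open("resaved.jpg")

diff = ImageChops.difference(original, resaved)
max_diff = max(v for band in diff.split() for v in band.getdata()) or 1
scale = 255.0 / max_diff
ela = diff.point(lambda px: min(255, int(px * scale)))   # brighten differences
ela.save("ela_result.png")
print(f"max difference level: {max_diff}")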
Terry Brock [00:24:30]:
Yep. Exactly. We can build a lot of that in there. Well, Mike, you have helped us a lot today. You scared us a little bit too, but that’s alright. Sometimes we need that to realize the reality.
Mike Primeau [00:24:40]:
Yeah. Well, I hope it was a good kind of scared, and I appreciate the opportunity to share with you. Thanks for having me.
Terry Brock [00:24:46]:
Absolutely. And if someone says, hey, we wanna find out about that, or we might need some forensics work, how would someone get in touch with you? What’s the best way to do that?
Mike Primeau [00:24:54]:
Yeah. So we’ve got, contact forms on our website, primoforensics dot com.
Terry Brock [00:24:58]:
And please spell that for those that might not speak English as a native language. How do you spell that?
Mike Primeau [00:25:02]:
Absolutely. P as in Paul, R as in Robert, I as in indigo, M as in Mary, E as in Edward, A as in alpha, U as in underscore, then forensics: PrimeauForensics.com. And, yeah, you can get in touch with us via phone or email. And then a great resource to learn more about the best practices within digital forensics is SWGDE, that’s swgde.org. It’s a great resource for information.
Terry Brock [00:25:35]:
Excellent. Well, Mike, you’ve helped us a lot in this area. Very, very important and we appreciate you being with us. Thank you, sir.
Mike Primeau [00:25:41]:
Thanks, Terry.