From Tech to Reality: Deep Truths About Deepfakes



Editor’s Note: With the potential for deepfakes to create malicious propaganda and other forms of fraud becoming increasingly significant in today’s digitally driven world of communications, understanding the technology behind deepfakes may be beneficial for data and legal discovery professionals as they seek to evaluate and authenticate electronically stored information. In this article extract from Cointelegraph author Shiraz Jagati, several industry experts, including Steve McNew of FTI Consulting, consider and comment on the deep truths of deepfakes.

Deep Truths of Deepfakes — Tech That Can Fool Anyone

An extract from an article by Shiraz Jagati via Cointelegraph

From a technical standpoint, visual deepfakes are devised using machine learning tools that decode facial images of the two individuals involved and strip them down into a matrix of key attributes, such as the position of the target’s nose, eyes, and mouth. Finer details, such as skin texture and facial hair, are given less weight and can be thought of as secondary.

The deconstruction is generally performed in such a way that it is almost always possible to fully recreate the original image of each face from its stripped-down elements. One of the primary measures of a quality deepfake is how well the final image is reconstructed — such that any movements in the imitator’s face are realized in the target’s face as well.
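The decompose-and-reconstruct pipeline described above can be sketched structurally. The sketch below is a hand-built illustration, not a learned model — real deepfake systems learn the encoder and the per-identity decoders from thousands of video frames, and all attribute names and values here are invented for the example:

```python
# Structural sketch of the deepfake pipeline: a shared "encoder" strips a
# face down to key expression attributes, and an identity-specific
# "decoder" re-renders those attributes with that identity's secondary
# details. Everything here is illustrative toy data, not a real model.

def encode(face):
    """Strip a face down to its key expression attributes; secondary
    details such as skin texture are discarded at this stage."""
    return {k: face[k] for k in ("mouth_open", "eye_gaze", "brow_raise")}

# Secondary, identity-specific detail lives on the decoder side.
IDENTITIES = {
    "imitator": {"skin_texture": "smooth", "facial_hair": "none"},
    "target":   {"skin_texture": "weathered", "facial_hair": "beard"},
}

def decode(attributes, identity):
    """Re-render the expression attributes onto a given identity's face."""
    face = dict(attributes)
    face.update(IDENTITIES[identity])
    return face

# One frame of the imitator speaking:
imitator_frame = {"mouth_open": 0.8, "eye_gaze": "left", "brow_raise": 0.2,
                  "skin_texture": "smooth", "facial_hair": "none"}

# The swap: encode the imitator's expression, decode it as the target.
# The target's face now carries the imitator's mouth/eye/brow movements.
fake_frame = decode(encode(imitator_frame), "target")
print(fake_frame)
```

The key design point the sketch captures is that the encoder is shared across identities while each decoder is identity-specific, which is why the imitator’s movements can be transferred onto the target’s face.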

To elaborate on the matter, Matthew Dixon, an assistant professor and researcher at the Illinois Institute of Technology’s Stuart School of Business, told Cointelegraph that both face and voice can be easily reconstructed through certain programs and techniques, adding that:

“Once a person has been digitally cloned it is possible to then generate fake video footage of them saying anything, including speaking words of malicious propaganda on social media. The average social-media follower would be unable to discern that the video was fake.”

Similarly, speaking on the finer aspects of deepfake technology, Vlad Miller, CEO of Ethereum Express — a cross-platform solution with its own blockchain that uses a proof-of-authority consensus protocol — told Cointelegraph that deepfakes are simply a way of synthesizing human images using a machine learning technique known as a generative adversarial network, or GAN: an algorithm that deploys two neural networks in combination.

The first network generates the image samples, while the second distinguishes the real samples from the fake ones. A GAN’s operation can be compared to the work of two people: the first is engaged in counterfeiting, while the other tries to distinguish the copies from the originals. If the first algorithm offers an obvious fake, the second will immediately detect it, after which the first improves its work by producing a more realistic image.
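The counterfeiter-versus-detective loop described above can be demonstrated on toy data. The sketch below is a deliberately miniature assumption: the “images” are single numbers drawn from a Gaussian, the generator and discriminator are tiny linear/logistic models rather than neural networks, and the training loop follows the standard alternating GAN updates with the non-saturating generator loss:

```python
import math
import random

random.seed(7)

def sigmoid(s):
    # Clamp the score to avoid math.exp overflow on extreme values.
    return 1.0 / (1.0 + math.exp(-max(min(s, 30.0), -30.0)))

# "Counterfeiter": a linear generator G(z) = a*z + b, noise z ~ N(0, 1).
# "Detective": a logistic discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0                  # generator starts by producing N(0, 1)
w, c = 0.1, 0.0                  # discriminator parameters
lr, batch = 0.05, 64
REAL_MEAN, REAL_STD = 4.0, 1.0   # the "originals" the generator must mimic

for step in range(2000):
    # --- Discriminator step: push D(real) toward 1 and D(fake) toward 0 ---
    gw = gc = 0.0
    for _ in range(batch):
        xr = random.gauss(REAL_MEAN, REAL_STD)   # a real sample
        xf = a * random.gauss(0.0, 1.0) + b      # a counterfeit sample
        er = sigmoid(w * xr + c) - 1.0           # dLoss/dscore, real (label 1)
        ef = sigmoid(w * xf + c)                 # dLoss/dscore, fake (label 0)
        gw += er * xr + ef * xf
        gc += er + ef
    w -= lr * gw / batch
    c -= lr * gc / batch

    # --- Generator step: push D(fake) toward 1 (non-saturating loss) ---
    ga = gb = 0.0
    for _ in range(batch):
        z = random.gauss(0.0, 1.0)
        e = sigmoid(w * (a * z + b) + c) - 1.0
        ga += e * w * z
        gb += e * w
    a -= lr * ga / batch
    b -= lr * gb / batch

# After training, generated samples should cluster near the real mean:
# the counterfeiter has learned to fool the detective.
fake_mean = sum(a * random.gauss(0.0, 1.0) + b for _ in range(5000)) / 5000
print(f"real mean = {REAL_MEAN}, generated mean ~ {fake_mean:.2f}")
```

A real deepfake GAN applies this same adversarial loop to deep convolutional networks operating on pixels rather than single numbers, but the dynamic — fake detected, generator improves, repeat — is identical.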

Regarding the negative social and political implications that deepfake videos can have on the masses, Steve McNew, an MIT-trained blockchain/cryptocurrency expert and senior managing director at FTI Consulting, told Cointelegraph:

“Online videos are exploding as a mainstream source of information. Imagine social media and news outlets frantically and perhaps unknowingly sharing altered clips — of police bodycam video, politicians in unsavory situations or world leaders delivering inflammatory speeches — to create an alternate truth. The possibilities for deepfakes to create malicious propaganda and other forms of fraud are significant.”

Read the complete article at Deep Truths of Deepfakes — Tech That Can Fool Anyone

Additional Reading

Source: ComplexDiscovery

A Matter of Pricing? A Running Update of Semi-Annual eDiscovery Pricing Survey Responses

First administered in December 2018 and conducted four times during the last two years with 334 individual responses, the semi-annual eDiscovery Pricing Survey highlights pricing for selected collection, processing, and review tasks. The aggregate results of all surveys, as shared in the provided comparative charts, may be helpful for understanding pricing and its impact on purchasing behavior for selected services over time.



