
Alumni Gazette

Denying Extremists a Powerful Tool

Hany Farid ’88 has developed a means to root out terrorist propaganda online. But will companies like Google and Facebook use it?

By David Silverberg
DIGITAL DETECTIVE: Farid and a Dartmouth student compare photographic and computer-generated images. In addition to his work combatting online child pornography and political extremists, Farid is founder and chief technology officer of a photo authentication service, Fourandsix Technologies. (Photo: Eli Burakian/Dartmouth College)

Hany Farid ’88 wants to clean up the Internet. The chair of Dartmouth’s computer science department, he’s a leader in the field of digital forensics. In the past several years, he has played a lead role in creating programs to identify and root out two of the worst online scourges: child pornography and extremist political content.

“Digital forensics is an exciting field, especially since you can have an impact on the real world,” says Farid. “When you look around, you see how wide the net is spreading. But with that comes new challenges and problems.”

His hallmark project is PhotoDNA, a program he created in partnership with Microsoft Research in 2008. PhotoDNA detects child pornography as the images are posted online. It works by matching new content posted on social media outlets to millions of pornographic images of children collected and maintained by the National Center for Missing and Exploited Children.

Now Farid is taking the PhotoDNA model and doubling down: he wants to find and root out extremist content that supports real-world violence and terrorism.

“If we want to really prevent extremist content from getting online in the first place, we need to develop a technology to process billions of images and videos daily,” he says.

Farid has created such a technology. It works by establishing a central database of extremist content and distributing unique fingerprints of each photo, video, and audio file to the platforms that want to filter this content. If a Twitter user, for example, uploads a video showing an execution of a soldier, this system would recognize that content as violating the outlet’s terms of service and the account would be automatically quarantined. An investigation would determine whether the quarantine was appropriate or a “false positive.” Law enforcement could be called in, when necessary, to further investigate the user’s account.
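The fingerprinting approach described above can be illustrated with a toy sketch. PhotoDNA's actual hash is proprietary and far more sophisticated; the "average hash" below is a well-known stand-in that shows the core idea: derive a compact fingerprint that tolerates small alterations, then compare each new upload against a database of known-bad fingerprints.

```python
# Illustrative sketch only: a simple perceptual "average hash," not the
# proprietary PhotoDNA algorithm. An image is assumed to arrive as a
# grayscale grid (list of rows of pixel values).

def average_hash(pixels, size=8):
    """Downsample to a size x size grid by block averaging, then set one
    bit per cell: 1 if the cell is brighter than the mean, else 0."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return sum(1 << i for i, v in enumerate(cells) if v > mean)

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_flagged(upload_hash, database, threshold=10):
    """Quarantine the upload if it is within `threshold` bits of any
    known fingerprint; human review then rules out false positives."""
    return any(hamming(upload_hash, bad) <= threshold for bad in database)
```

Because the fingerprint reflects coarse image structure rather than exact bytes, a re-encoded or lightly edited copy of a known image still lands within a few bits of the stored fingerprint, while an unrelated image does not. This is what lets a platform check billions of uploads against the central database cheaply.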

Farid has partnered with the Counter Extremism Project, a nonprofit organization led by former officials from the Department of State and Homeland Security. He says the technology’s adoption should be a “no-brainer” for social media outlets. But so far, the project has faced resistance from the leaders of Facebook, Twitter, and other outlets who argue that identifying extremist content is more difficult, presenting more gray areas, than child pornography. In a February 2016 blog post, Twitter laid out its official position: “As many experts and other companies have noted, there is no ‘magic algorithm’ for identifying terrorist content on the Internet, so global online platforms are forced to make challenging judgment calls based on very limited information and guidance.”

Farid disputes that argument. “Companies should take responsibility for the misuse of their platforms, from trafficking of underage prostitutes, to selling illegal weapons, to promoting and radicalizing extremists who then commit heinous crimes,” he says.

Media outlets including the Wall Street Journal, the Atlantic, and the PBS NewsHour have called on Farid in pieces or segments exploring the debate. Although concerns about privacy are widespread, Farid is not alone in his criticism of social media companies. Last August, a panel in the British Parliament issued a report charging that Facebook, Twitter, and Google are not doing enough to prevent their networks from becoming recruitment tools for extremist groups.

Steve Burgess, president of the digital forensics firm Burgess Consulting and Forensics, admires Farid’s dedication to projects that, according to Burgess, aren’t common in the field. “It’s great that such a tool has come into existence,” he says of Farid’s antiterrorism technology.

After studying computer science and applied mathematics at Rochester, Farid earned a PhD in computer science from the University of Pennsylvania and was a postdoctoral fellow at MIT. These days, he stays busy even beyond his commitment to teaching and research. In 2014, he cofounded the photo authentication service Fourandsix Technologies, where he remains as chief technology officer.

Enmeshed in the seedy underground of extremist online propaganda, Farid says he appreciates the chance to get away from it all. He lives with his wife, Emily, on several acres of land in Vermont. He takes his mind off technology by cutting wood to prepare for harsh winters and, he adds, “bumbling through the woods on my tractor.

“At the end of the day, I’m definitely away from screens.”