In 2016, cybersecurity expert Patrick Wardle heard a deeply disturbing story: cybercriminals were using malware to covertly spy on people through their macOS webcams and microphones. In one particularly disturbing case, a hacker used malware called “Fruitfly” to hijack laptop webcams and spy on children.
Wardle has experience discovering such programs. Before entering the private sector, he worked as a malware analyst at the National Security Agency, where he analyzed code used to target Department of Defense computer systems. Experienced in playing digital defense, Wardle decided to take action against the spyware threat: he created OverSight, a macOS tool that lets users monitor their webcams and microphones for signs of malicious tampering. “It was really popular, everyone loved it,” he said of the tool, which he released for free through his nonprofit, Objective-See.
A few years later, however, Wardle was analyzing some suspicious code for a client and found something odd in a tool that had been downloaded onto the client’s device. The tool was made by a large company, but it offered functionality similar to OverSight’s, including the ability to monitor macOS webcams and microphones. Digging into the program, Wardle found familiar code. Too familiar. His entire OverSight algorithm, including the bugs he had failed to remove, was sitting inside another company’s program. A developer had reverse-engineered his tool, stolen his work, and repackaged it into a different but nearly identical product.
“I like to use the analogy of plagiarism: someone copied what you wrote, and they even copied your spelling and grammar mistakes,” Wardle said. “I always say there are many ways to skin the proverbial cat, but this is like blatant copyright [infringement].”
Wardle was taken aback. He immediately contacted the company to alert it to the fact that one of its developers had lifted his code. Unfortunately, Wardle said, this wasn’t the last time he would find that a company had ripped off his work. Over the next few years, he found evidence that two other large companies had used his algorithms in their own products.
This week, Wardle presented his findings at Black Hat, the annual cybersecurity conference in Las Vegas. Together with Johns Hopkins professor Tom McGuire, Wardle showed how reverse engineering (the process of taking a program apart to see how it works) can reveal evidence of this type of theft.
Wardle declined to identify the companies that stole his code. This wasn’t about revenge, he said; it was about identifying “systemic issues” affecting the “cybersecurity community.” To that end, Wardle used his presentation this week to outline some of the lessons he learned while trying to inform companies about the theft.
“You contact these companies and say, ‘Hey, you basically stole something from me. You reverse-engineered my tool and re-implemented the algorithm,’ which is legally… er, gray. In the EU, there is a directive saying that if you [do that], it’s illegal. But it’s also just bad optics. I run a nonprofit. You’re essentially stealing from a nonprofit, putting it in your commercial code, and making money off it. It doesn’t look good,” he said with a smile.
The responses Wardle received were mixed. “It depends on the company,” he said. “Some were great: I got an email from the CEO acknowledging it and asking, ‘What can we do to fix this?’ Amazing… [With] others, it’s a three-week internal investigation, and then they come back and tell you to take a hike because they don’t see any overlap.” In those cases, Wardle had to provide more evidence of what had happened.
Why does this kind of thing happen in the first place? Wardle said his views have changed over time. “I started out thinking these were evil corporations out to suppress indie developers. But in every case, it was basically a misguided or naive developer who was responsible for [finding a way to] monitor microphones and webcams… then he or she would reverse-engineer my tool and steal the algorithm… and then no one in the company would ask, ‘Hey, where did you get this?’”
In all three cases, after Wardle presented his case to a company, executives eventually admitted wrongdoing and offered to rectify the situation. To make his case effectively, however, Wardle often had to show them evidence. He said he had to take their own closed-source software, reverse-engineer it to understand how their code worked, and prove that it matched his own. To support his case, Wardle also worked with the nonprofit Electronic Frontier Foundation (EFF), which provides pro bono legal services to independent security researchers. “Having them on my side gave me a lot of credibility,” he said, suggesting that other developers could follow a similar tactic.
“I’m in a great position because I’m working with the EFF and I have a huge audience in the community because I’ve been doing this for a long time,” Wardle said. “But if this happened to other developers, they probably wouldn’t have [the same standing]… in which case the company might just tell them to get lost. So what I really want to do is talk about this and say, ‘Hey, this is not good.’”
As for how common this kind of algorithmic theft is, Wardle thinks it’s fairly widespread. “I believe it’s a systemic problem, because when I started looking, I found not just one case but several. And they [the companies] are completely unrelated.”
“One of the points I’m trying to make is that if you’re a company, you really need to educate your employees and developers [not to steal]. Because if they do, it exposes your entire organization to legal risk. And, again, the optics are really bad,” he said.