That is a real stretch (good sound bite, I suppose), but what you suggest would require the phone company to be aware of the content of the calls (monitoring). With advertising, the publisher is well aware of the content. A better comparison would be a TV commercial advertising (with a picture) a young, underage prostitute for hire. I personally don't have a real issue with consenting adults paying for sex (though I have never needed or desired such a service), but there really is a problem with underage sex trafficking. The numbers, to me, are staggering. I apparently don't feel the same way you do about the responsibility these media professionals bear for contributing to the problem. They can and should do better at setting standards for what is appropriate.
Edit: Come on, Rich, you build websites. Would you knowingly build one that advertised young, underage prostitutes? (I am guessing not.)
No, I wouldn't. In fact, I used to run a minimally moderated soapbox site, and when illegal stuff (mainly doxing) showed up, I deleted it (and, when appropriate, reported it). The whole purpose of the site was to be an uncensored forum for people to say whatever they wanted, but that didn't include illegal behavior.
But I didn't shut down the whole site because a few people used it for illegal purposes. The fact that some people use a resource for vile purposes shouldn't prevent consenting adults from using it for purposes that are none of anyone's business. When I finally did shut the soapbox site down, it was because the small amount of money it generated didn't justify the time I had to spend reviewing content that the various filters flagged as suspicious.
Now scale that up to a site as big and popular as Backpage was, and it becomes effectively impossible to police by hand. But Backpage did try. They had algos in place to flag suspicious content, removed millions of ads in a typical month, and reported hundreds of cases of suspected child exploitation every month to NCMEC (the National Center for Missing & Exploited Children, which, by the way, acknowledged those reports but also contended that Backpage's efforts were inadequate).
The problem with algos is that eventually, people figure out ways around them. But expecting people who run sites like Backpage or Craigslist to manually review every single posting is categorically ridiculous. It would, in fact, be analogous to the phone company listening in to every conversation to make sure nothing illegal was being discussed. Even if they had the inclination, they don't have the staff.
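Just to make that concrete, here's a toy sketch (in Python, purely hypothetical, and nothing like whatever Backpage actually ran) of how a naive keyword filter works and how easily a determined poster slips past it:

    import re

    # Toy keyword filter -- purely illustrative, not any real site's system.
    # It flags a posting if it contains any blocked term as a whole word.
    BLOCKED_TERMS = {"escort", "underage"}  # hypothetical blocklist

    def is_flagged(posting: str) -> bool:
        words = re.findall(r"[a-z]+", posting.lower())
        return any(word in BLOCKED_TERMS for word in words)

    print(is_flagged("Young escort available tonight"))  # True  -- caught
    print(is_flagged("Young e$cort avail. tonite"))      # False -- trivially evaded

You can keep adding variants to the blocklist, but the posters adapt faster than the list does, which is exactly why human review ends up being the backstop.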
Even the government doesn't hold itself to the standard it expected Backpage to meet. Pretty much everything that's said by anyone in America using any kind of electronic media is monitored -- but by machines, not by real people listening in to every phone call and reading every email and text message, and it's very imperfect. (Use your favorite search engine to look up Echelon if you want to know more.)
So why should a private company that runs what is basically a bulletin board be expected to do perfectly what the government, with all its vast resources, cannot? And why isn't the same demand made of other companies whose business is providing a platform for others to communicate -- the phone company, the Post Office, or the general store owner with a physical bulletin board for the public to post messages on?
There are limits to both technology and human review. The fact is that a forum owner can't prevent anything from being posted in the first place. All they can do is remove it (and report it, if appropriate) after the fact, and even then only after becoming aware of it.
Backpage tried, albeit imperfectly, to do that. I read somewhere when the case first broke that Backpage reported more trafficking and child exploitation cases than all other entities combined. Does the fact that they weren't perfect justify removing a resource that most people used for perfectly legal purposes?
Rich