It's a reasonably well-made documentary, composed of a series of interview fragments occasionally interspersed with bits of drama that provide an artistic illustration of the point.
The filmmakers make the fashionable argument that social media, in its current form, is bad. Nowhere in the course of the documentary do they attempt to define what social media means. It is tacitly assumed to mean Facebook, Twitter, YouTube, Instagram, possibly Reddit, plus several less prominent platforms. One interviewee, to whom they give the most screen time, started his crusade with Gmail; so there is a half-hearted attempt to throw email in the bag as well.
The argument is uneven. Its best and most convincing part concerns the addictive design that social media have evolved to keep users engaged: the notifications, the "someone is typing" ellipsis, the "someone has tagged you in a photo" alert, the unexpectedness and novelty of whatever comes up next in the feed.
At the same time, many points just get glossed over.
Everyone sees their own news feed. But that is just an extension and amplification of what we observe in the physical world, with people keeping different company, reading different newspapers, and watching different TV channels, which may have radically different takes on the facts.
Truth. An interviewee trots out the tired line: "People have no idea what's true, and now it's a matter of life and death." Fine. But if we don't just let people decide for themselves what is true and whom to trust, who then is going to decide it for them? If voices deemed untrue get suppressed, it raises the question of power that modern theorists are so obsessed with. How is it that the truth about non-binary genders gets amplified and disseminated, while the truth about flat earth or Pizzagate gets suppressed? This evokes all sorts of analogies, with the Inquisition or with the Soviet government, that such critics of social media never address.
Then there's a familiar trope about democracy. "What we're seeing is a global assault on democracy." I don't know what they mean by democracy. Apparently not the direct political activity of a part of the population, sparked by social media (by the way, wasn't the Arab Spring, or the Orange Revolution, an exercise in "democracy"? Would they have been imaginable without social media?). Perhaps to them democracy means having a common narrative uniting the majority of the country. It is as if they want state propaganda back, because it ensured, up to a point, some kind of social cohesion.
A blue-haired data scientist speaks: "I like to say that algorithms are opinions embedded in code... and that algorithms are not objective. Algorithms are optimized to some definition of success. So, if you can imagine, if a... if a commercial enterprise builds an algorithm to their definition of success, it's a commercial interest. It's usually profit." Wait, how is achieving a given definition of success not objective? You have defined criteria for success; you can measure results against those criteria. The choice of the definition may reflect an opinion, but the optimization toward it is perfectly objective.
Closer to the end, though, the same blue-haired data scientist offers an astute epistemological observation: "People talk about AI as if it will know truth. AI's not gonna solve these problems. AI cannot solve the problem of fake news. Google doesn't have the option of saying, 'Oh, is this conspiracy? Is this truth?' Because they don't know what truth is. They don't have a... They don't have a proxy for truth that's better than a click."
I also got the impression that most of the interviewees are soft-skills people: project managers, strategists, designers. There were very few, if any, who actually got their hands dirty writing code.