Calls in the US Congress for regulation of social media are nothing new.

The tech giants have been called in for questioning time and time again, but not much has come of it.

This time, however, things could be different, after whistleblower Frances Haugen published internal Facebook documents.

Unlike before, there now seems to be political agreement that legislation is required, reports CNBC.

- If you had closed your eyes, you would not have been able to tell who was a Democrat, who was a Republican, or where they came from.

The entire country has seen the damage that can be linked to Facebook and Instagram, said Richard Blumenthal, chair of the Senate committee, about the hearings in Congress.

"Can not legislate on culture"

It is clear that both political camps want legislation, but the question is whether they can agree on what it should look like.

Given how much of the hearings focused on young people's mental health, legislation could prove difficult to draft, since it is not necessarily prohibited posts that are behind the problem.

- You cannot legislate the culture of social media, but you can legislate what kind of material the algorithms serve to particular people.

If the system senses that I am insecure about my weight, what kind of material and advertising gets directed at me?

Companies can be held responsible for that part, says Professor Dan Svantesson, who researches law and the internet at Bond University in Australia.

Could lose their liability shield

Another possible route is to amend what is known as "Section 230", a law that shields platform owners from liability for content that users publish.

Several countries have introduced systems where companies can be fined if they fail to delete illegal content, but experience has shown that such systems often create other problems.

- There is a risk that companies take the easy way out, which in many cases means removing more than is required.

We have seen several examples of this. For instance, information about breast cancer has been removed because the images were judged to violate rules on nudity, says Svantesson.

There are also other risks in making companies responsible for users' content, not least because laws enacted in individual countries can, in practice, take effect globally.
