Online industry unveils plan to regulate internet safety

Associations representing companies such as Meta, Google and Telstra have released industry codes of conduct they say will make the internet safer, just days after being threatened with legal action by online regulators.

The codes, developed through industry consultation over a year, cover almost every area online, from the manufacture and maintenance of internet-connected devices to internet service providers, social media and encrypted messaging.

The codes require providers to notify authorities of the most dangerous content when they believe it poses a serious and imminent threat, and to have systems in place to forward reports of such material on encrypted services to authorities.

A “security feature” in the code requires providers of encrypted messaging services to register users with a phone number, email address, or other identifier for the service.

Businesses that don’t comply with the codes face penalties and content bans from Australia’s online regulator, which last year was given greater investigative and enforcement powers by Parliament.


The eSafety Commissioner last year directed Australia’s online industry to develop codes dealing with the treatment of Class 1 and Class 2 material online – internet content that would either be refused classification or classified as R18+.

This includes things like online material that depicts or promotes the sexual exploitation of children, terrorism, extreme crime and violence, crime and violence, drug use and pornography.

The Commissioner’s power to do this and then to investigate and enforce compliance with the Codes stems from Australia’s Online Safety Act, passed with bipartisan support last year.

On Thursday, groups including DIGI, which represents big tech and social media companies, the Australian Mobile Telecommunications Association and the video game sector’s Interactive Games and Entertainment Association released their version of codes for the most harmful Class 1 material.

The eight codes cover various online services such as social media, gaming, search, messaging and app distribution, as well as internet carriage services and the manufacture and supply of “any equipment that connects to the internet”.

Under the onlinesafety.org.au banner, the six industry associations are now consulting publicly on the codes, which will then be submitted to the eSafety Commissioner for registration.

The draft codes contain minimum compliance levels. For example, under the Social Media Services Code, all social media companies are required to remove child sexual exploitation material (CSEM) or pro-terror material that they believe poses a serious and imminent threat to the safety of Australians within 24 hours or “as soon as reasonably practicable”.

Under the electronic services codes, which cover messaging and gaming, each business that qualifies as a broad-reach electronic service (one that provides services to a wide range of users) must have agreements with users regarding the distribution of material.

Closed-loop communication and encrypted messaging services must require a user to register for the service using a phone number, email address, or “Other Identifier.”

Under this code, providers of large or encrypted services must notify authorities of child sexual exploitation and pro-terror material identified on their services, also within 24 hours or “as soon as reasonably practicable”.

All ISPs must ensure that children cannot obtain an internet carriage service without parental or guardian consent, instruct their customers not to produce online material that violates Australian law, and work with the eSafety Commissioner to develop a protocol for blocking child sexual exploitation material in accordance with the relevant code.

Spokespeople for the groups said the industry codes would make the online environment safer.

“The draft code strengthens security safeguards across the online industry to protect Australians from certain harmful content and makes existing protections more consistent and transparent,” said Sunita Bose, DIGI Managing Director.

John Stanton, CEO of fellow technology group Communications Alliance, added: “The new codes are there to protect all Australians. It is important that we hear from a wide range of stakeholders and internet users whether the draft codes work for them.”

The industry groups behind the Codes are the Australian Mobile Telecommunications Association (AMTA), BSA The Software Alliance (BSA), Communications Alliance (CA), Consumer Electronics Suppliers Association (CESA), Digital Industry Group Inc. (DIGI) and Interactive Games and Entertainment Association (IGEA).

Public consultation on the draft codes is open until October 2. Further work on another code covering Class 2 adult content is planned to begin as the next stage once the Class 1 codes are registered.

On Tuesday, Australia’s eSafety Commissioner Julie Inman Grant ordered the tech giants behind some of the biggest social media and messaging apps to detail what they are doing to tackle child exploitation material. She did so by issuing legal notices under the government’s Basic Online Safety Expectations, a key part of the Online Safety Act 2021.

Correction: Codes for Class 2 material are not necessarily going to be released this year as stated in a previous version of this article.

Do you know more? Contact James Riley via email.
