This report follows the committee's second interim report, which was tabled in the House in October. That report examined how Meta's decision to abandon deals under the News Media Bargaining Code could influence the provision and consumption of public interest journalism in Australia and give rise to mis- and disinformation. It also found that, while the code was established in good faith, its implementation has revealed significant shortcomings. Accordingly, the committee made 11 considered recommendations to improve the effectiveness of the code and the sustainability of public interest journalism and digital media in Australia.
This third and final report looks closely at the influences and impacts of social media on Australian society. The committee received 220 submissions throughout the inquiry, conducted 10 public hearings and received additional written responses to many questions on notice. We heard from experts, academics, bureaucrats, big tech companies, advocates, grieving families and young Australians who have grown up with digital technology. Social media users in Australia are among the most active in the world, with approximately 81 per cent of all Australians reporting that they were regular users of social media in 2023.
While acknowledging that social media is a huge part of everyday life for most Australians, this final report examines social media in its entirety—the good, the bad and the ugly. The committee heard that social media can be addictive and disruptive to offline activities. It can impact sleep, contribute to the spread and consumption of mis- and disinformation, harm mental health and, at its worst, expose vulnerable users to online predators, unhealthy expectations around body image, bullying and sextortion. In some tragic cases, it can lead people to extremely dangerous self-harming behaviours, such as eating disorders and suicidality. We heard that much more needs to be done to protect Australians from these harms, particularly by social media companies themselves, who make money by keeping their users engaged. Social media is never free, because users pay for it with their attention. As one young witness observed, if the product is free then you are the product. This statement reflects the true nature of social media platforms. They provide a free space for community and connection, but in exchange the user is beholden to the business decisions of big tech companies, whose focus is squarely on increasing revenue without consideration for the health and wellbeing of their customers.
This report puts big tech on notice. Social media companies are not immune to the need to have a social licence to operate here in Australia. Participants in the inquiry also painted a picture of how the relationship Australians have with social media is complex and forever evolving. We were told that many Australians, particularly young Australians, enjoyed using social media and, for the most part, didn't want their access to it restricted. As one young person said, social media isn't just a platform; it's a lifeline for connection, information and community for many young people. But we also heard that social media companies use opaque algorithms that keep users scrolling, feeding us what they think we want to see, even if it is harmful. Young people recognise these harms and the vulnerabilities of some users, but they want to better protect themselves from harm and have more control over their social media experience to improve their own health and wellbeing. They want to be able to alter, set or indeed turn off their personal algorithms and recommender systems. They want greater control over the content and paid advertising they see and when they see it, and they want to be active participants in co-designing policy to help improve the safety and accountability of online platforms.
The committee also heard from mental health organisations who told us how social media is used to support people's mental health and wellbeing, with many Australians accessing mental health resources online. But they too agreed that some harms were so extreme that interventions were required and much more should be done to protect users from online harms. Academics and experts spoke to the many different facets of the social media environment, including its harms, noting that there is no one silver bullet that is going to solve this problem. But they were adamant that this shouldn't stop us from taking immediate action to better protect Australian users. We heard that, while legislating an age limit might not be the perfect solution and should certainly not be the only solution, it would provide important breathing space for the implementation of long-term sustainable digital reforms.
Finally, the committee also heard from parents regarding the horrific online harm experienced by their children, with some parents drawing a direct link between the influence and impact of social media use and their child's mental health and wellbeing. We heard about young users suffering eating disorders who are continuously shown social media content that is harmful to their recovery, and we heard about vulnerable young people who are endlessly bullied and sextorted via social media, leading to them taking their own lives.
Parents presenting evidence to the committee pleaded for an increase in the minimum age for children to access social media, noting the current restrictions had failed. Without a legislated minimum age to access social media, parents said, they felt unsupported in their efforts to protect their kids from social harms. They wanted to be able to tell their kids, 'No. It's the law. You have to wait until you're old enough.' And they worried that social media companies would take years to implement reforms that made them responsible for their platforms and for preventing online harms.
While everyone agrees that social media is part of everyday life and will remain so, social media platforms and online services have a key responsibility for the safety of their users. This report makes recommendations for immediate and long-term government action, but it also puts the responsibility back onto big tech, who absolutely must do better. It contains 12 well-defined recommendations that go to the heart of the problem: keeping Australian users safe. The report recommendations include greater enforceability of laws to bring digital platforms under Australian jurisdiction; support for a single and overarching statutory duty of care for digital platforms to ensure Australian users, particularly children, are safe online; effective mandatory data access for independent researchers and public interest organisations, coupled with a rigorous auditing process by appropriate regulators; measures to enable users to have greater control over the content they see by having the ability to alter, reset or turn off their personal algorithms and recommender systems; greater protections for users' personal information; inclusion of young Australians in the co-design process for the regulation of social media; research and data collection provisions that enable evidence-based policy development; ongoing education to improve digital competency and online safety skills; built-in safety-by-design principles for current and future platform technology; a transparent complaints mechanism that incorporates a right-of-appeal process; and adequate resourcing for the office of the eSafety Commissioner to discharge its ever-evolving functions. Taken together, these recommendations map a pathway forward for social media reforms in Australia.
The committee notes that in the past two weeks the government has announced the introduction of legislation to make 16 the minimum age of access for social media. Other recent measures include legislation to combat the rise of mis- and disinformation and a landmark scams prevention framework which calls for fines of up to $50 million and requires social media platforms, banks and telecommunications companies to protect Australians from online scams. And just last week the government announced that it will be legislating a digital duty of care to place the onus on digital platforms to proactively keep Australians safe and better prevent online harms in the first place. These actions are part of a suite of government reforms that complement each other and incentivise the design of a safer, healthier digital platforms ecosystem.
The committee strongly supports the 12 recommendations in this final report, along with the recommendations of our second interim report. Collectively, these 23 recommendations map a pathway forward for social media reforms in Australia and put big tech on notice, because social media companies are not immune from the need to have a social licence to operate in Australia.
I would like to sincerely thank the secretariat, who have worked hard to meet the committee's deadlines and have been exceptional in providing support to the committee. Thank you to the committee secretary, Gerry McInally; Aysha Osborne; Natasha Rusjakovski; Michael Perks; Aisha Bottrill; and Jamison Eddington. I would also like to thank—I see her sitting at the table—the former chair of this committee, the member for Jagajaga and assistant minister, for her diligent work in the first iteration of this committee. I also want to acknowledge all of the committee members—some of whom I see sitting in the chamber this evening—who have worked productively with me throughout this inquiry. It doesn't mean we haven't had our challenges and differences, but we have landed in a place where there is a good collective will to ensure improved online safety for all Australian users.
My message is simple: the age of unregulated social media is over. Online safety is paramount, and social media platforms must take responsibility to ensure fundamental protections are in place. Social media platforms have a social responsibility for the safety of their users, and this report maps out ways in which they can be held to account, ensuring social media is a safe place for all Australians to find connection, community and reliable information.