
Search engines such as Google and Microsoft’s Bing must implement age assurance checks for logged-in users within six months. Image: Shutterstock
Australians using search engines while logged in to accounts from the likes of Google and Microsoft will have their age checked by the end of 2025, under a new online safety code co-developed by technology companies and registered by the eSafety Commissioner.
Search engines operating in Australia will need to implement age assurance technologies for logged-in users “no later than six months” from the code’s registration, under new rules published on Monday.
While only logged-in users will have their age checked, many Australians surf the web while signed in to accounts from Google, which dominates Australia’s search market and also runs Gmail and YouTube; and Microsoft, which runs the Bing search engine and the email platform Outlook.
If a search engine’s age assurance systems determine a signed-in user is “likely to be an Australian child” under the age of 18, the service will need to set safety tools such as “safe search” functions to their highest setting by default, filtering out pornography and high-impact violence, including in advertising.
Currently, Australians must be at least 13 years of age to manage their own Google or Microsoft account.
The age assurance technologies used are expected to be similar to those currently being considered for Australia’s under-16s social media ban, which is due to begin in December.
Age assurance methods can include age verification systems, which use government documents or ID; age estimation systems, which typically use biometrics; and age inference systems, which use data about online activity or accounts to infer age.
Search engines will not be required to implement age assurance measures for users who are not logged in to their services, according to the new rules.
“Internet search engine services are designed for general public use, with or without an account,” the code states.
However, users who are not logged in should also expect “default blurring of images of online pornography and high-impact violence material detected in search results”.
Other compliance measures in the code require search providers to improve their search and age assurance technologies over time, prevent autocomplete predictions “that are sexually explicit or violent”, and respond to searches about eating disorders or self-harm with crisis prevention information.
Google and Microsoft were contacted for comment.
Earlier this year, Google said it would begin using artificial intelligence to estimate users’ ages, starting with tests in the United States, while Microsoft has previously said it has explored age assurance methods while weighing the potential impacts on user safety and privacy.
Many Australians surf the web while signed in to accounts for Google or Microsoft services. Image: Shutterstock
Changes ‘designed to protect’ Australian kids
The new rules for search engine operators were “designed to protect” Australian children, according to the code.
Drafting of the code was co-led by the Digital Industry Group Inc. (DIGI), which counts Google, Microsoft, and Yahoo among its members and was also contacted for comment.
eSafety Commissioner Julie Inman Grant said she had registered three new codes submitted by the online industry, which covered harmful content on search engines, enterprise hosting services, and internet carriage services such as telecommunication firms.
The codes had been in the works since July 2024 and failure to comply with them could result in civil penalties of up to $49.5 million per breach, her office said.
The Commissioner said she had sought extra safety commitments from the industry on six outstanding codes, which covered the likes of app stores, device manufacturers, social media, and messaging services.
“It’s critical to ensure the layered safety approach which also places responsibility and accountability at critical chokepoints in the tech stack including the app stores and at the device level, the physical gateways to the internet where kids sign-up and first declare their ages,” Inman Grant said.
Google is the most dominant search engine in Australia, and is estimated to receive more than 90 per cent of search queries. Image: Shutterstock
Push to protect children who use AI chatbots
Members of the technology industry had also been asked to use the remaining six codes to strengthen their protections against generative AI chatbots engaging in harmful behaviours with children, Inman Grant said.
“We are already receiving anecdotal reports from school nurses that kids as young as 10 are spending up to five hours a day with AI chatbots, at times engaging in sexualised conversations and being directed by the chatbots to engage in harmful sexual acts or behaviours,” she said.
Inman Grant said she would consider the changes proposed by the industry and would aim to make her final determination on the six outstanding codes by the end of July.
“If I am not satisfied these industry codes meet appropriate community safeguards, I will move to developing mandatory standards,” she said.