Online Safety Act Risk Assessment

Details
Service name cr8r.gg
Service type User-to-user (U2U) service
Status Draft
Completion date 2025-03-10
Next review/update date 2026-03-01
Reason for review Regular review
Completed by Coinneach Wulson
Named person responsible for the risk assessment XXX
Approved by (governance and accountability channels) XXX

Scope Check

  1. Does your online service have links with the UK?

    Yes

  2. Do you provide a “user-to-user” service?

    Yes

  3. Do you provide a search service?

    Yes (you can search content on other federated instances)

  4. Does your online service publish or display provider pornographic content?

    No, we don’t publish/display any pornographic content

  5. Do any exemptions apply to the user-generated content on your online service? Select all that apply

  • Yes, users can only communicate by email, SMS, MMS and/or one-to-one live aural communications; or
  • Yes, users can only interact with content generated by my business
  • No, my service is not limited to these types of content
  6. Do any exemptions apply to your online service? Select all that apply
  • Yes, it is an internal business service, including services such as business intranet, content management systems, or customer relationship management systems
  • Yes, it is provided by a public body, such as Parliament, a UK public authority, or foreign government
  • Yes, it is provided by a UK education or childcare provider
  • No, none of the above applies

Step 1: U2U and Search Risk Profiles and Risk Factors

User-to-User Service Risk Profile and Risk Factors
  1. Is your service any of these types?
  • Social media service
  • Messaging service
  • Gaming service
  • Adult service
  • Discussion forum or chat room
  • Marketplace or listing service
  • File-sharing or file storage service
  • None of the above
  2. Do child users access some or all of the service?

Yes. We do not hold this information, so we must presume that children may access the service.

  3. Does your service include any of these user identification functionalities?
  • User profiles
  • Anonymous user profiles or users without accounts
  • None of the above

Note: Posted content is visible to users without accounts, but there is no way to identify these users.

  4. Does your service include any of these user networking functionalities?
  • Users can connect with other users
  • Users can form closed groups or send group messages
  • None of the above
  5. Does your service include any of these user communication functionalities?
  • Livestreaming (either open or closed channels)
  • Direct messaging (including ephemeral direct messaging)
  • Encrypted messaging
  • Commenting on content
  • Posting or sending images or videos (either open or closed channels)
  • Posting or sending location information
  • Re-posting or forwarding content
  • None of the above

Note: Users could post their own location, but this is not a feature that the service provides directly.

  6. Does your service allow users to post goods and services for sale?

    Yes

    Note: This is not a marketplace, but users may share links to goods they have created, which may be purchasable.

  7. Does your service include any of the following functionalities that allow users to find or encounter content? Tick all that apply.

  • Searching for user-generated content
  • Hyperlinking
  • None of the above
  8. Does your service use content or network recommender systems?

Yes

Note: Occasionally users may be shown a list of suggested accounts that may be of interest, based on other users on the instance.

Search Service Risk Profile and Risk Factors
  1. Is your service any of the following service types?
  • General search service (including downstream general search service)
  • Vertical search service
  2. Do child users access your service?

Yes

  3. Does your service have any of the following functionalities?
  • Provide users with search predictions or suggestions
  • Allow users to search for photographs, videos or visual images

Step 2: U2U and Search: Assess the risk of harm

Risk levels and evidence
Terrorism
Risk level Low
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Child sexual exploitation and abuse (CSEA) offences
Risk level Low
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Grooming (child sexual exploitation and abuse)
Risk level Low
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Image-based child sexual abuse material
Risk level Low
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Child sexual abuse material URLs
Risk level Low
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Hate
Risk level Low
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Harassment, stalking, threats and abuse offences
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Controlling or coercive behaviour
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Intimate image abuse
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Extreme pornography offence
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Sexual exploitation of adults
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Human trafficking
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Unlawful immigration
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Fraud and financial services offences
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Proceeds of crime
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Drugs and psychoactive substances
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Firearms, knives and other weapons
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Encouraging or assisting suicide (or attempted suicide)
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Foreign interference offence
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.
Animal cruelty
Risk level Negligible
Risk factors considered Unlikely on our small Mastodon/Fediverse instance.
Additional characteristics considered None
Existing controls considered Active Moderation Team, Reporting Features
Evidence Lack of occurrences of the stated type of content on our instance.

[Note]

We are a small yet international instance with an active moderation team. Our community guidelines ensure that we take all reasonable measures to minimise our users' exposure to these risk factors.

Step 3: U2U and Search: Decide measures, implement and record

How many monthly active UK users does your service have?

Our instance had 313 active users over the last 30 days, which includes non-UK users.

We estimate there are at most 100 UK users.
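The 30-day active-user figure can be read from Mastodon's public instance endpoint, `GET /api/v2/instance`, which reports a rolling count in `usage.users.active_month`. A minimal sketch of extracting that field, using a hard-coded sample response rather than a live request (field names follow the Mastodon 4.x API; the sample values are illustrative):

```python
import json

# Hypothetical response body from GET /api/v2/instance (Mastodon 4.x);
# only the fields used below are shown.
sample_response = json.dumps({
    "domain": "cr8r.gg",
    "usage": {"users": {"active_month": 313}},
})

def monthly_active_users(instance_json: str) -> int:
    """Extract the rolling 30-day active-user count from an instance response."""
    data = json.loads(instance_json)
    return data["usage"]["users"]["active_month"]

print(monthly_active_users(sample_response))  # → 313
```

The endpoint does not break users down by country, so the UK share above remains an estimate.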

ICU A2: Individual accountable for illegal content safety duties and reporting and complaints duties
Status Pending
Date measure takes/took effect N/A
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU C1: Content moderation function to review and assess suspected illegal content
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU C2: Having a content moderation function that allows for the swift take down of illegal content
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D1: Enabling complaints
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D2: Having easy to find, easy to access and easy to use complaints systems and processes
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICS D6: Appropriate action: Complaints about suspected illegal content
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D7: Appropriate action for relevant complaints about suspected illegal content
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D8: Appropriate action for relevant complaints which are appeals – determination (services that are neither large general nor multi-risk)
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D9: Appropriate action for relevant complaints which are appeals – determination
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D10: Appropriate action for relevant complaints which are appeals – action following determination
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D11: Appropriate action for relevant complaints about proactive technology, which are not appeals
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D12: Appropriate action for all other relevant complaints
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU D13: Exception: manifestly unfounded complaints
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU G1: Terms of service: substance (all services)
Status Implemented
Date measure takes/took effect 2025-03-13
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU G3: Terms of service: clarity and accessibility
Status Implemented
Date measure takes/took effect 2025-03-13
Relevant codes Child sexual exploitation and abuse, Terrorism, Other duties
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023
ICU H1: Removing accounts of proscribed organisations
Status Implemented
Date measure takes/took effect 2022-11-05
Relevant codes Terrorism
Relevant duties Section 10(2), (3), and (5) to (9). Section 20(2). Section 21(2) and (3) of the Online Safety Act 2023

Step 4: U2U and Search: Report, review, and update risk assessments

  • Date of next annual risk assessment:

    April 1st

  • Confirmation findings of the illegal content risk assessment have been reported, and recorded:

    Yes, here.

  • Date the findings of the illegal content risk assessment were reported, and recorded:

    2025-03-12

Children's Access Assessment
Stage 1
  • Is it possible for children to access the service or part of it?

    Yes

[Note]

The instance is public, and "accessing the service" includes visiting the site without logging in or posting. We do not have any age verification tools, or limits on the age of people viewing the instance.

Stage 2
  • Are there a significant number of children who are users of the service?

    No

[Note]

We have no evidence of children accessing our service. It is impossible to know whether visitors are aged under 18; all we have is an IP address.

  • Is the service of a kind likely to attract a significant number of children?

    No

[Note]

While we are a general-purpose instance, we have no evidence of content specifically made for children.

  • Result

    No need to carry out a Children's Risk Assessment.