
Jagex EU Digital Services Act (DSA) Transparency Report for the period ending 16 February 2025

This transparency report is published pursuant to Article 15 of the EU Digital Services Act (the “DSA”) and covers intermediary services provided by Jagex. This includes in-game chat services provided in games published by Jagex (“Jagex Products”).

The report reflects the period from 17 February 2024 until 16 February 2025 inclusive.

Section A: Illegal Content

1. The number of orders received from Member State authorities

No such orders were received, including orders issued in accordance with Articles 9 and 10.

2. The number of notices submitted in accordance with Article 16 by trusted flaggers

No such notices were received.

3. The number of notices submitted in accordance with Article 16 categorised by the type of alleged illegal content concerned and any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider

  • Total notices received: 1,722

    • From EU: 781 (45%)

    • Not EU: 941 (55%)

The geographical location of a report submission is determined by means of limited technical data capture relating to the connection country at the time of submission. It is possible for reporters from outside the EU to be treated as within the EU if, for example, the user connects via a Virtual Private Network (VPN) that places the network connection point in an EU country. Similarly, an EU user may be classified as not from the EU if they are using a VPN that places the connection point outside the EU.
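
As a purely illustrative sketch (the report does not describe the actual implementation), classification by connection country might look like the following, where the two-letter connection country code is the only input; the EU_COUNTRIES set and classify_report function are hypothetical names, not part of any Jagex system:

```python
# Hypothetical sketch of EU / non-EU classification by connection country.
# The country code comes from limited technical data captured at the time of
# report submission; a VPN can make this differ from the reporter's real location.

EU_COUNTRIES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
}

def classify_report(connection_country: str) -> str:
    """Classify a report as 'EU' or 'Not EU' from its connection country code."""
    return "EU" if connection_country.upper() in EU_COUNTRIES else "Not EU"

# Example: a reporter connecting through a VPN exit node in France is
# classified as EU regardless of their actual location.
print(classify_report("FR"))  # EU
print(classify_report("GB"))  # Not EU
```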

Where reports are classified as ‘Not EU’, the outcomes of the review are not recorded in this transparency report, in order not to contaminate the EU-specific reporting data. However, the service does investigate, and take action on, some reports submitted from outside the EU.

The breakdown of EU reports by type of illegal content is as follows:

| Type of illegal content | Total EU reports | Total Action Taken | Action Taken Based on Breach of Terms | Action Taken Based on Breach of Legality | No Action |
| --- | --- | --- | --- | --- | --- |
| Animal welfare | 24 | 0 | 0 | 0 | 24 |
| Consumer information infringements | 37 | 0 | 0 | 0 | 37 |
| Data protection and privacy violations | 73 | 0 | 0 | 0 | 73 |
| Illegal or harmful speech | 104 | 7 | 7 | 0 | 97 |
| Intellectual property infringements | 32 | 0 | 0 | 0 | 32 |
| Negative effects on civic discourse or elections | 27 | 0 | 0 | 0 | 27 |
| Non-consensual behaviour | 74 | 4 | 2 | 2 | 70 |
| Pornography or sexualised content | 35 | 5 | 5 | 0 | 30 |
| Protection of minors | 37 | 6 | 2 | 4 | 31 |
| Real world scams and/or fraud | 109 | 0 | 0 | 0 | 109 |
| Risk for public security | 46 | 2 | 0 | 2 | 44 |
| Self-harm / risk to life | 40 | 3 | 0 | 3 | 37 |
| Unsafe, non-compliant or prohibited products | 44 | 0 | 0 | 0 | 44 |
| Violence | 99 | 2 | 2 | 0 | 97 |
| Total | 781 | 29 | 18 | 11 | 752 |
Reports with an outcome of ‘no action’ are defined as follows:

  • The illegal content reported could not be located

  • User error in key data needed for investigation (for example, a typo in the reported username)

  • User is reporting content on a platform or service not controlled by Jagex

  • The content was identified but not deemed illegal

  • The content was identified but lacked sufficient context to meet the legal threshold

  • The user was not reporting illegal content but used the reporting route for other reasons (such as requesting a refund)

4. The number of notices submitted in accordance with Article 16 processed by using automated means

No notices received were processed by automated means.

5. The median time for taking action for notices submitted in accordance with Article 16

  • Median time from receipt of a report to review completion and communication of the outcome to the reporter = 10 hours.

  • Reports for which review was completed and the outcome communicated to the reporter within 7 days of receipt = 100%.

  • Analysis of the first 48 hours after report receipt (a sketch of how such metrics can be computed follows this list):

    • 83% of reports were resolved within 24 hours.

    • 89% of reports were resolved within 36 hours.

    • 93% of reports were resolved within 48 hours.
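
As a minimal, hedged sketch of how metrics like these can be derived (not a description of Jagex's actual tooling), the median resolution time and the ‘resolved within N hours’ percentages can be computed from per-report received/resolved timestamps; the report_times sample data and function names below are hypothetical:

```python
# Hypothetical sketch: computing the median resolution time and the share of
# reports resolved within a given number of hours, from (received, resolved)
# timestamp pairs. All data below is illustrative, not from the report.
from datetime import datetime
from statistics import median

report_times = [
    (datetime(2024, 3, 1, 9, 0), datetime(2024, 3, 1, 17, 0)),   # 8 hours
    (datetime(2024, 3, 2, 10, 0), datetime(2024, 3, 2, 22, 0)),  # 12 hours
    (datetime(2024, 3, 3, 8, 0), datetime(2024, 3, 4, 14, 0)),   # 30 hours
]

# Resolution time of each report, in hours.
hours = [(done - received).total_seconds() / 3600 for received, done in report_times]

def resolved_within(limit_hours: float) -> float:
    """Percentage of reports resolved within `limit_hours` of receipt."""
    return 100 * sum(h <= limit_hours for h in hours) / len(hours)

print(f"median: {median(hours):.0f} hours")        # median: 12 hours
print(f"within 24h: {resolved_within(24):.0f}%")   # within 24h: 67%
```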

Section B: Content moderation undertaken at Jagex's initiative

1. Information about content moderation engaged in at Jagex's initiative

i. Moderation

Illegal content is not permitted in Jagex Products, and we use a combination of automated systems and human review to prevent, identify and remove it from in-game chat and other user-generated content. We additionally apply sanctions to accounts which post illegal content or other content prohibited by our terms and conditions. Such sanctions include muting (allowing a user to continue to play but not to use free-text chat) or banning (preventing the user from continuing to play) an account, either temporarily or permanently.

In all instances where we apply a temporary or permanent penalty to a game account, the user is notified through two channels: (1) a notification at game log-in; and (2) a message in the user's account message centre.

Users are supplied with evidence that relates to a penalty applied unless:

  • providing evidence would be likely to tip off the user that they are under suspicion, for example a user admitting to the possession of child abuse media;

  • providing evidence would disclose detection methods that may inform a user how to negate future detection;

  • the automatic penalty is based on evidence from several sources (for example abuse reports accumulated over a period of time) where inclusion of the multiple sources of evidence is impractical and/or technically prohibitive; or

  • providing evidence would inadvertently disclose the personal data of a third party.

All users can submit an appeal against any decision we have taken in relation to a game account with some limited restrictions. All appeals are read and reviewed by a human moderator, and outcomes are typically provided within 48 hours of submission.

There are a limited number of penalties for which we do not provide an appeal route, in situations where we are entirely confident that a thorough investigation has taken place before the penalty has been applied and where there is no material benefit to further engaging with the user in dialogue. These situations are:

  • a user has committed financial fraud on the platform, for example the use of stolen credit cards to make fraudulent purchases;

  • we believe a user presents a significant threat to the life of other people;

  • we believe a user presents a significant threat towards children;

  • we believe the user is too young to play our games;

  • we have reviewed an illegal content report about a user and are satisfied that the reported content passes the legal threshold and that our technical data indicates the user is responsible for the illegal content;

  • the user has been abusive towards staff, including threats to harm or kill;

  • the user is involved in a serious breach of our terms of service, and it would not be possible to offer an appeal route without disclosing the detection methods and proprietary technical indicators used to identify the breach of terms; or

  • a user has already had an account that has received one of the penalties above, and automated systems have automatically excluded subsequent accounts they have created, identified by technical association.

ii. Automated Tools

We use automated anti-cheat technologies in relation to Jagex Products. When users connect online to a game server, these technologies may activate and monitor game play, the files on users’ devices that are associated with the Jagex Product or that otherwise access our servers, and the memory of users’ devices, purely for the purposes of detecting and preventing cheating.

If any of these anti-cheat technologies detect cheating, we may collect relevant information necessary for human investigation and enforcement purposes. The following systems are used:

  1. Systems which monitor in-game chat to detect (i) phrases which are illegal or otherwise prohibited by our Terms (including by our Content Standards Policy) and (ii) behaviour indicative of spamming or other conduct contrary to our Terms (including any game rules incorporated into our terms). Users may face an automatic sanction up to and including a 24-hour mute if such keywords or behaviour are detected (a simplified sketch of this kind of keyword-based filtering follows this list).

  2. Systems which monitor player abuse reports, and which may apply automatic sanctions (up to and including a permanent mute).

  3. Systems which monitor in-game chat to detect phrases which may indicate that a user is under 13 years old, presents a risk to others, or presents a child protection risk. Where triggered, these systems provide details for human review, following which sanctions up to and including a permanent ban may be applied.

  4. Filters which automatically censor certain offensive words or phrases in in-game chat. Some Jagex Products allow the user to customise the level of filtering.

  5. Common hate terms and top-level domains are filtered by hard-coded rules in chat channels; these filters cannot be disabled by users.

  6. Checks of character names at account creation which endeavour to block the creation of accounts with offensive or misleading character names.

  7. Checks which block an account from being created and/or played if the user appears to be under the age of 13.

  8. Anti-fraud technology which monitors purchases and payments to block fraudulent or suspicious transactions.
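
The report does not disclose how these systems are implemented. Purely as a simplified, hypothetical sketch of the kind of keyword-based filtering described in items 1, 4 and 5 above, a chat message might be censored and flagged as follows; the word lists, names and flagging rule are illustrative assumptions only:

```python
# Hypothetical sketch of keyword-based chat moderation: censor filterable
# terms, always censor hard-coded terms, and flag messages for sanction.
# Word lists and the flagging rule are illustrative placeholders only.

FILTERABLE_TERMS = {"badword"}                   # tier users may be able to adjust
HARD_CODED_TERMS = {"hateterm", "example-tld"}   # cannot be disabled by users

def moderate_message(text: str, user_filter_on: bool = True) -> tuple[str, bool]:
    """Return (displayed_text, flagged_for_sanction) for a chat message."""
    flagged = False
    out = []
    for word in text.split():
        lowered = word.lower()
        if lowered in HARD_CODED_TERMS:
            out.append("*" * len(word))   # always censored
            flagged = True                # may lead to an automatic sanction
        elif user_filter_on and lowered in FILTERABLE_TERMS:
            out.append("*" * len(word))   # censored at this filter level
        else:
            out.append(word)
    return " ".join(out), flagged

# Example: the hard-coded term is censored and the message is flagged.
print(moderate_message("hello hateterm"))  # ('hello ********', True)
```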

iii. User Tools

We provide the following tools to allow users to customise their online experience and to prioritise issues to us, mitigating the likelihood of exposure to harm:

  • User-submitted reports of hate language, child protection issues and real-life threats are prioritised for review, with a typical SLA of 24 hours.

  • Content is not persistent; chat is typically viewable for a few seconds and the user's record of chat is cleared after each session.

  • Users can only private message or form groups by mutual consent.

  • Users can hide their online status and block other users.

  • Users can hide all chats in all channels if desired.

2. Measures taken to provide training and assistance to persons in charge of content moderation

Jagex has a long-standing record of supporting content moderators in their role. In 2023, Middlesex University conducted academic research, interviewing content moderators and comparing their experiences and wellbeing with those of a control group. The research drew six conclusions about wellbeing issues that content moderators are far more likely to experience:

  1. Desensitisation - moderators become numb to the content over time, which is linked to reduced empathy and compassion.

  2. Intrusive Thoughts - it can be hard to switch off from work, and many found themselves thinking about things they had seen or heard at work even when they did not want to.

  3. Hypervigilance - looking for hidden dangers, or believing something bad might happen, in situations where others thought everything was fine.

  4. Cynicism - finding it hard to remain hopeful and optimistic, and expecting the worst of others and the world.

  5. Emotionally Drained - the work is emotionally and physically exhausting, and many had problems getting to sleep or staying asleep.

  6. Intimacy - moderators found it hard to form intimate relationships or to talk to loved ones about the job or its emotional effects.

References:

  • https://repository.mdx.ac.uk/item/8q4yx

  • https://repository.mdx.ac.uk/download/35839979d59c54ca6e4f56b7b68235d11ad04c3a93377a876ef99a7d68e6e55e/422344/spence_coping.pdf

Based on this insight, Jagex has introduced a suite of measures to protect and support its content moderators, which includes but is not limited to the following:

  • Robust onboarding, training and process guidelines.

  • Regular refresher training.

  • Scheduled 1-2-1 sessions with managers that include mandatory wellbeing checks.

  • ‘No quibble’ breaks if needed.

  • Variance in work types to reduce fatigue and the risk of desensitisation.

  • Sharing of positive feedback and promotion of the value of the work done.

  • Private medical care and 24/7 access to live chat medical professionals.

  • Monthly moderation wellbeing drop-in sessions.

  • Access to professional counselling.

  • Training for managers to enable them to identify and support mental health issues.

  • ‘Opt out’ of content moderation work with no financial or job-role impact.

  • 24/7 second line support for agents to sense check decisions and act as a ‘sounding board’.

  • Access to a range of Employee Health Services.

  • Internal platforms to encourage peer-to-peer support.

  • Zero tolerance approach to staff abuse.

  • Flexible work patterns / work-life balance.

3. Content moderation measures taken at Jagex's initiative during the reporting period

Excluding illegal content reports (outlined in the table at section A.3. above), human moderators applied 50,056 account offences in the reporting period as a result of chat-based rule-breaking identified through abuse reports submitted by users.

Additionally, during the reporting period the following sanctions were applied by the automated system triggers for (1) and (2) listed under section B.1.ii. above:

  • Permanent mute - 3,664

  • Temporary mute - 35,141

Across all appealable penalty types (including non-content moderation measures) the appeal outcomes are:

  • Appeals denied - 174,779 (71.28%)

  • Appeals granted - 70,453 (28.72%)

The high percentage of appeals granted is a consequence of bad actors engaging in unacceptable and/or unlawful conduct on hijacked accounts. If the account owner subsequently regains control of the account and submits an appeal against historical penalties, we generally grant the appeal on the basis that the account owner was not in control of the account at the time of, and was therefore not responsible for, the offences. The high ‘grant’ rate of appeals is therefore not indicative of the accuracy of detection and penalty application, but a consequence of the high occurrence of unacceptable or unlawful conduct following an account hijacking.

In isolation, the accuracy of automated and manual actions is above 95% if measured by ‘was the penalty/action correct’, but without allowing for ‘was the account creator in control of the account at the time’.

4. The number of complaints received through internal complaint-handling systems

Jagex operates a complaints process under section 24 of our Terms and Conditions. Typically, these complaints take the form of a user admitting they have breached the terms and requesting leniency or ‘another chance’. No formal record is kept of these types of complaints and no outcome or response is provided to the user.

If a user has sent a complaint in respect of a sanction applied to their account, but has not already used our online service to submit an appeal, they are contacted by email (if they have provided a contact address) with details of how to appeal. No further review is carried out outside the dedicated online appeal route.
