Trust, Safety & Content Moderation Management

Overview
Gain the in-depth technical knowledge and skills to lead trust and safety teams in the rapidly evolving regulatory environment surrounding Big Tech.

This programme is aimed at early-stage trust, safety and content moderation professionals who are keen to progress into management and team leader roles in the sector. Throughout the programme, learners develop their problem-solving, critical thinking, communication and project management skills. Learners are also taught how to deploy and operate relevant software tools and gain a thorough grounding in professional self-care practices.

On programme completion, learners will have developed:
- A detailed knowledge of contemporary transnational trust and safety regulatory and compliance issues, as well as cross-industry community platform standards and ethical approaches.

- An understanding of the international online cultural environment, including the sociopolitical challenges it has produced for modern society.

- A core competency in understanding today's online threat landscape and initiating online investigations into bad actors.

- An understanding of key project management theories and their practical application in a Trust and Safety working environment.

- A core competency in a range of software applications.

- An understanding of the key principles and uses of data analysis techniques in project management.

- An enhanced range of soft skills, including presentation and communication skills, report writing, critical thinking, problem-solving and self-care practices.

Course Highlights
The programme deepens and broadens learners' existing skills with competencies in project management, appropriate deployment and operation of software packages, regulation and compliance, problem-solving, communications, and critical thinking, with accredited transferable proficiencies in these areas. Learners develop the skills and competencies to build their resilience, specifically how to look after themselves psychologically when dealing with traumatic, scarring or abusive content. They also work on applying these principles within their wider team and workplace settings.

Subjects taught

This course comprises nine modules taught across three semesters.

Course Modules
Data Processing Technologies (5 ECTS)
Regulation and Compliance (5 ECTS)
Cultural and Political Evolution of Social Media (5 ECTS)
Applied Project Management for Trust and Safety Professionals (10 ECTS)
Data Management Technologies (5 ECTS)
Platform Standards and Ethics (5 ECTS)
Self-Caring Practice in Occupational Contexts (10 ECTS)
Communication and Thinking Skills (10 ECTS)
Digital Investigations (5 ECTS)

Entry requirements

This programme is aimed at early-stage trust, safety and content moderation professionals who hold a Level 8 qualification (with a minimum 2.2 honours grade) and wish to achieve a Level 9 qualification in trust, safety and content moderation management. To reflect the diversity of workers in the area, who include Level 8 graduates in communications, languages, psychology and business studies, the programme is open to holders of non-cognate NFQ Level 8 qualifications.

Application dates

How to Apply
Apply directly to Griffith College.

Duration

1 year part-time.

More details
  • Qualification letters: PgDip
  • Qualifications: Postgraduate Diploma (Level 9 NFQ)
  • Attendance type: Part time
  • Apply to: Course provider