Why cyberbullying is more than just an online safety issue

28 Oct 2020

Tijana Milosevic, a research fellow at DCU and the SFI Adapt Centre. Image: Tijana Milosevic

Tijana Milosevic of the SFI Adapt Centre and DCU is working to better understand how cyberbullying affects children and the role of AI in addressing abuse.

Tijana Milosevic received a PhD in communication from American University in Washington DC in 2015. She went on to become a postdoctoral researcher at the Department of Media and Communication at the University of Oslo in Norway, before joining the National Anti-Bullying Research and Resource Centre (ABC) at the Dublin City University (DCU) Institute of Education in July of last year.

In August 2020 she started as a research fellow jointly appointed with ABC and Science Foundation Ireland’s Adapt Centre. She is also a co-principal investigator on a Facebook-funded project that examines children’s views about the effectiveness of a natural language processing-based tool for detecting cyberbullying on social media platforms.

‘Thinking about cyberbullying only as an online safety issue can limit our thinking about how to approach the problem’

What inspired you to become a researcher?

Before starting my PhD programme, I worked as a teacher in primary school and middle school. During recess, I would observe children’s social interactions and the role of smartphones and social media in this process.

Some children were excluded or unpopular for no apparent reason, and I was particularly fascinated with the process of social positioning – how children established their identity in a social group and the role of social media in this process.

This work largely influenced my desire to study cyberbullying, but my own personal experiences as a sensitive child trying to understand other children and the world around me certainly played a role too.

Can you tell us about the research you’re currently working on?

My work is very interdisciplinary and takes place in collaboration with colleagues not just from the field of communication, but also from psychology, sociology, anthropology and computer science, among others. I am currently working on several projects; one of them relates to children’s and families’ digital media use during Covid times.

We are interested in whether children have experienced more or fewer online risks during lockdown, and how greater reliance on digital technology influences their and their families’ overall wellbeing. This is a project that I am working on with my colleagues from the National Anti-Bullying Research and Resource Centre, Prof James O’Higgins Norman and Derek Laffan. It is an international comparative project, taking place in more than 10 European countries.

My other current research broadly examines the role of AI in addressing abuse and cyberbullying on social media and digital messaging platforms. This work has evolved from my examination of social media companies’ cyberbullying policies, where I studied reactive moderation.

That work mostly relied on interviews with company representatives and with other stakeholders, such as policymakers, charities or non-governmental organisations in the e-safety community. Currently, the focus is more on proactive moderation, and we are primarily examining children’s views on the effectiveness of this process when it comes to keeping them safe online.

It is also about the balance of children’s rights to safety on the one hand, but also privacy and freedom of expression on the other. Ensuring that children’s rights, as stipulated in the United Nations Convention on the Rights of the Child, are honoured is one of the goals of this process.

As part of the Elite-S project, I aim to steer the conversation on standard-setting around the use of AI to address cyberbullying among children in a direction that embeds consideration of the need to balance children’s rights. My Adapt-based colleague on this work, who complements it with his expertise in computational science, is Dr Brian Davis, assistant professor at the DCU School of Computing.

In your opinion, why is your research important?

I see its importance in its ability to give a voice to children, providing an avenue for them to give feedback on how technology that has implications for their wellbeing is designed.

My research should also inform regulation and policy, and I see its importance in being able to hold the industry accountable for designing technology with children’s wellbeing in mind. Such research is particularly salient now, when designing regulation of online harms is in focus globally, including in Ireland with the Online Safety and Media Regulation Bill.

What are some of the biggest challenges you face as a researcher in your field?

Perhaps at this point it is the relative lack of truly interdisciplinary collaboration, outside of corporate research, between computational scientists, social scientists and humanities scholars that specifically addresses my topic of research.

Another is the relative lack of datasets to work with and, more generally, the time-consuming nature of undertaking research with children, coupled with a demand to produce solutions and publish quickly.

I find this often to be counterproductive. Good research takes thinking, planning, ethical considerations and time. The system often does not make room for this.

Are there any common misconceptions about this area of research?

Cyberbullying is often seen only as an online safety issue. While this is certainly part of the picture, cyberbullying is also a relational issue. The problem can be related to the above-mentioned process of social positioning, which I would like to explore further in my future work.

Thinking about cyberbullying only as an online safety issue can limit our thinking about how to approach the problem.

What are some of the areas of research you’d like to see tackled in the years ahead?

I would like to further explore the process of social positioning on social media, the role of the technological affordances of various platforms in this process, and how it links to cyberbullying. This includes how algorithmic curation affects the process and the role of platform moderation in managing outcomes.

Are you a researcher with an interesting project to share? Let us know by emailing editorial@siliconrepublic.com with the subject line ‘Science Uncovered’.