The pandemic of conspiracies in the COVID-19 age: How Twitter reinforces the online infodemic

The pandemic has accelerated the pervasiveness of social media as tools to obviate, in times of forced distancing, the need for social relations. As Deborah Lupton notes, digital media have played a far more important role in the COVID-19 crisis than during the HIV/AIDS emergency of the 1990s; however, they have also contributed to the spread of misinformation and fake news, often characterised by conspiracy-type narratives. The investigation, carried out in line with the Digital Methods approach, analyses how a popular conspiracy theory on Twitter, the flat Earth theory, activates and reinforces the spread of other intertwined conspiracies by exploiting popular hashtags as popularity multipliers. The essay analyses the role of Twitter in reinforcing informational cascades related to multiple conspiracies such as the flat Earth, COVID-19/5G and no-vax theories. Moreover, the analysed contents reveal a significant polarisation marked by hateful content and an aggressive lexicon used both by conspiracy supporters and by those who oppose them. © 2021 by authors; licensee OJCMT.


INTRODUCTION
The pandemic has accelerated the pervasiveness of social media as ideal environments to obviate the need for relationships in times of forced distancing: as Deborah Lupton observes, digital media played a far more important role in the COVID-19 phase than during the HIV/AIDS emergency of the 1990s; however, they also enabled the enormous spread of false news, often characterised by conspiracy-type narratives. One of the salient dimensions of the recent pandemic has been the increase in disinformation and its social impacts (Mian & Khan, 2020): the World Health Organization has defined the recent crisis not only as a pandemic but as an infodemic, a pandemic of information that is false or lacks any scientific evidence, such as the alleged links between COVID-19 and 5G technologies, or the effects of pseudo-cures advertised by prominent public figures. This is the case of the supposed benefits of chloroquine and hydroxychloroquine, mentioned among others by the entrepreneur Elon Musk and the former US President Donald Trump (Limaye et al., 2020; Liu et al., 2020). It has also been shown that the infodemic has had significant repercussions, increasing prejudice and social stigma toward citizens of Asian origin, considered "vectors" or even "responsible" for the spread of the virus, and undermining the implementation of social policies aimed at the most fragile citizens.
Several studies have investigated, through automated analysis of social media content, the spread of the infodemic in conjunction with the evolution of the health emergency, identifying multiple forms of misinformation. The research of Islam et al. (2020), for example, focused on three types of misinformation circulating on Facebook, Twitter and online newspapers, including fact-checking agency websites, which can have serious implications for public health: rumours, contents that stigmatise individuals or institutions, and conspiracy theories. The data, collected between 31 December 2019 and 5 April 2020, were analysed in a descriptive and qualitative way. The study performed a content analysis of current affairs articles to compare data collected from other sources, identifying 2,311 reports of rumours, stigma and conspiracy theories in 25 languages and from 87 countries. The contents concerned disease, transmission and mortality related to COVID-19, containment measures (21%), treatment and cure (19%), the causes of the virus including its geographic origin (15%), violence (1%) and various news related to the pandemic (20%). Of the 2,276 reports for which textual evaluations were available, 1,856 claims were false (82%). Such disinformation, fueled by rumours, statements aimed at stigmatising other individuals or groups, and conspiracy theories, can have potentially serious implications for the individual and the community if it prevails over evidence-based news. It should also be noted that various studies have ascertained that disinformation, in its various forms, does not have private citizens or independent organisations as its only privileged sources: significantly, it is often the "official" sources or accredited online newspapers that contribute to the increase in online misinformation; at the very least, the data reveal that official sources contribute as much as independent sites to the growth of the infodemic.
Cinelli et al. (2020) carried out an analysis of various social networks, including Twitter, Instagram, YouTube, Reddit and Gab, studying the dissemination of content connected to COVID-19; following the model of the information epidemic, they revealed that the volume of disinformation produced by reliable sources does not differ much from that attributable to alternative and unreliable sources.

THE CONSPIRACIES PANDEMIC
According to another authoritative survey, conspiracy theories are among the most widespread forms of fake news in times of pandemic, often spread by sources considered reliable such as the mainstream media (Papakyriakopoulos et al., 2020). Between January and March 2020, researchers identified 11,023 unique URLs (where URLs represent online sources of information) referring to the causes of COVID-19 and appearing in 267,084 posts across Facebook, Twitter, Reddit and 4chan. They found that among these URLs, alternative sources had generated more information aimed at bolstering conspiracy theories than traditional sources. However, conspiracy stories from traditional sources reached far more users. In addition, the researchers quantified the dynamics of conspiracy theories in the social media ecosystem, noting that stories reinforcing conspiracies generally had greater virality than neutral stories or those aimed at debunking them or reporting discrediting facts (ibid., p. 2).
Social media such as Twitter, Facebook and Reddit play a fundamental role both in spreading fake news and in enhancing its visibility (Allington et al., 2020; Shahsavari et al., 2020). On 4 May 2020, the 26-minute video titled "Plandemic", posted on YouTube, which claimed that COVID-19 was a laboratory product conceived to guarantee pharmaceutical giants an enormous income from the vaccine, received 2.6 million likes in a few hours 1 . Italy was one of the countries most affected by COVID-19, and a recent survey by the Italian National Research Council (CNR) on about 140,000 subjects observes that, in the health emergency phase, Italians' use of social media increased significantly and that a significant part of the monitored sample uses social platforms to "read online what the news is hiding from us" (Tintori et al., 2020, p. 14).
Despite the multiplicity and diversity of conspiracy theories present online, their arguments are structured around recurring rhetoric. At the base of conspiracy theories there is a progressive polarisation between the conspiracists (those who identify the cause of a phenomenon in the conspiracy) and the tricksters: the real conspirators, often identified in the techno-scientific, economic and political elites (Neville-Shepard, 2018). In the analysis of conspiratorial texts it is necessary to identify, alongside the discourses, also those who elaborate them. These issuers stage a typical dramaturgy (Wexler & Havers, 2002) which includes at least four figures in its cast: the conspiracy theorists; the accused power elite (including public institutions, official agencies and debunkers); the witnesses and experts the conspiracy theorists rely on; and the audience (formed by the public of the wider society). As a result, the phenomenon of conspiracy is very broad and often transcends any single theory, presenting itself as a narrative format that targets different theories or subjects, achieving an extraordinary echo.

THE RESEARCH FOCUS: THE ROLE OF SOCIAL MEDIA IN THE SPREAD OF CONSPIRACY CONTENTS
The processes of formation and circulation of conspiracy theories have strengthened exponentially thanks to the Internet. Online communication emphasises the intertextuality of conspiratorial language, that is, the property of each discourse to refer to other discourses from heterogeneous sources (politics, religion, economics, science, etc.) with which it is eclectically assembled by web users (Panchenko, 2016). In recent years, flat Earth theories 2 have gained a certain popularity thanks to social media; they are based on anti-science positions and accuse institutions such as NASA or the United Nations of acting as lobbies hiding truths relating to the shape of planet Earth or to historical events such as the 1969 moon landing. Conspiracy theories are also often characterised by hateful content, to the extent that they contribute to polarising individuals and groups toward extremist positions: a significant example is that of the COVID-19 denialists who reject the restrictive measures put in place by governments to contain the pandemic and often aggressively target experts, virologists or representatives of institutions. The same register can be seen in the no-vax movement, which rejects anti-COVID vaccines because they are produced by the alleged big pharma complex, which would make citizens dependent on its economic and technological power.
What emerges from the empirical analysis of these conspiracies, apparently very different from each other, is the role of social media, and in particular of Twitter, in amplifying their spread thanks to the multiplicative and synergistic effect derived from the affordances of the platforms. Conspiracy narratives are in fact nourished by the informational cascades (Sunstein, 2017, p. 98) typical of social networks: processes of information exchange by virtue of which the belief that a fact is true spreads simply because other people seem to believe it.

As Cass Sunstein argues:
"In an informational cascade, people cease relying at a certain point on their private information or opinions. They decide instead on the basis of the signals conveyed by others. It follows that the behavior of the first few people, or even one, can in theory produce similar behavior from countless followers" (op. cit., p. 99).
Informational cascades are difficult to prevent and become more treacherous when they can reach large numbers of people with a single click. Today they present themselves as a social media by-product spreading fake news or, in the case of conspiracy theories, contents aimed at discrediting mainstream positions. Furthermore, informational cascades can feed each other and are often generated by different conspiracy narratives: proof of this is the instrumental use of popular hashtags (e.g., those linked to former President Donald Trump) to spread conspiracy theories that have nothing to do with the President but manipulate his visibility to increase their impact. Conversely, it is often the same public figures, including Trump himself, who exploit conspiracy informational cascades to increase their online visibility and gain a wider consensus. This is the case, for example, of the QAnon conspiracy, initially launched on 4chan and 8chan and then amplified on Twitter and other mainstream platforms by Trump's official profile, which claims the existence of a conspiracy by exponents of the American Democratic Party acting as an occult power group to the detriment of the Republicans (Zuckerman, 2019). This conspiracy was recently involved in the movements that assaulted Capitol Hill on 6 January 2021 3 .
Social networks represent the ideal meeting environment for conspiracy supporters: individuals who in the pre-Internet era were isolated have gained the means to connect with others who share their interests and can aggregate to strengthen in-group bonds. However, it is worth noting differences, for example, among conspiracy theory groups on Facebook: some tend to isolate themselves from external interference, while others are quite active in proselytism. Within Facebook, the distinction lies in the privacy management methods used by groups: some create private groups, accessible only to those who receive authorisation from an administrator. In this context, private groups become the closest realisation yet of what has so far been defined as an echo chamber. According to Axel Bruns (2017, p. 3): "An echo chamber comes into being where a group of participants choose to preferentially connect with each other, to the exclusion of outsiders. The more fully formed this network is (that is, the more connections are created within the group, and the more connections with outsiders are severed), the more isolated from the introduction of outside views is the group, while the views of its members are able to circulate widely within it".
However, public groups aimed at proselytising represent the largest category: they collect a higher number of subscriptions and are the virtual arena in which the debate between supporters and detractors takes place. These pages and groups are also used to spread propaganda, mainly through audiovisual content. There is evidence, in fact, that the YouTube algorithm has significantly contributed to the spread of the flat Earth conspiracy theory. In an interview on 30 July 2019, the BBC gathered the opinion of Guillaume Chaslot, a former YouTube insider. He revealed how the algorithm he helped conceive offered users the "best video", that is, the one that, according to YouTube's strategy, could hold their attention on the screen the longest. In this way, users would view additional content and be exposed to a greater number of advertisements (Lewis & McCormick, 2018). The result was that videos explaining the flat Earth were far more engaging than those describing the spherical Earth: this disparity evidently increased the likelihood of exposing a significant number of users to conspiracy theories.
The investigation presented here starts from a well-established conspiracy theory on social media, the flat Earth, to observe the ways in which it attracts and at the same time reinforces the spread of other conspiracies (for example, the one attributing the causes of COVID-19 to 5G networks) by exploiting some popular hashtags that work as visibility multipliers. The investigation was prompted by the spread on Twitter of the ambiguous #flattenthecurve: a hashtag used in flat Earth circles since 2017 which, at the beginning of the first lockdown in March 2020, was taken up in reference to the COVID-19 infection curve, assuming a new and unexpected visibility. Drawing from the tweets analysed, the essay addresses the following research questions: What are the main hashtags related to a popular conspiracy theory such as flat Earth? Do they refer to different conspiracies? What are the functional relationships among different conspiracies?

METHODOLOGY
The methodology used refers to the field of "Digital Methods Analysis" (Rogers, 2019), which falls within the broader framework of Computational Social Science. Using software tools such as TCAT (Twitter Capture & Analysis Tool) 4 and Gephi, both described in detail below, we collected and analysed a database of Twitter content relating to flat Earth between 12 December 2019 and 6 April 2020. We started our data collection from a well-known conspiracy theory, flat Earth, because in this way we had the chance to gather a significant amount of conspiratorial data that, according to our premises, could easily lead us toward multiple and different conspiracy narratives. We started our TCAT capture with three main keywords (flatearth, researchflatearth, earthisflat) as specific references to the flat Earth conspiracy movement. The dataset collected contains over 110,000 tweets belonging to over 82,000 different users. Our goal was to use the database to analyse, through tweets, keywords and hashtags, to what extent the different conspiracy plots intertwine on Twitter. Starting from March 2020, multiple conspiracy contents relating to coronavirus and COVID-19 emerged. Since then, flat Earth, coronavirus, 5G, COVID-19 denial and the no-vax movement started to intertwine indiscriminately within the Twitter digital environment. The hashtag #flattenthecurve, ambiguously used both in relation to flat Earth and to COVID-19, has been the most significant trait d'union among the different conspiracies.
Before describing the different phases of the contents analysis, however, it is necessary to deepen the methodological framework of reference and the potential of Digital Methods Analysis.

The Digital Methods Approach and Methodological Challenges
2020 was an extraordinary year for the Internet, and the global lockdown caused an impressive surge in the volume of data produced online. This volume of online traces has favored the development of Computational Social Science, a relatively new research approach based on software tools used to analyse social phenomena. The definition of "new" is due to the fact that it has introduced new approaches to the scientific method which, while once based solely on evidence, can now rely on something less tangible such as computer simulations and Big Data (Anderson, 2008). The management of Big Data, however, requires extraordinary computational power; consequently, this type of analysis is not accessible to every researcher (Lazer et al., 2020). Alternatively, there are different solutions in the academic field that rely on Digital Methods: "Digital Methods are techniques for the study of societal change and cultural condition with online data. They make use of available digital objects such as the hyperlink, tag, timestamp, like, share and retweet, and seek to learn from how the objects are treated by the methods built into the dominant devices online [..]" (Rogers, ibid., p. 3).
With regard to Twitter, the basic analysis approach consists in identifying the main entities and the relationships between them, where by entity we mean users, hashtags, audiovisual material, external links, etc. This can be done with different software tools freely available online, such as TCAT. TCAT is a software tool developed by the Dutch research group DMI (Digital Methods Initiative) that provides four modes of data capture: Word-Tracking, Follow-User, 1%-Sample and Geo-Tracking. The Word-Tracking mode captures, given a word or a group of words, the tweets containing them in the text field. Up to 400 words can be searched at the same time, and the total volume of traffic captured (the number of tweets added to the database every minute) can never exceed 1% of the total traffic on the platform in the same amount of time. The Follow-User mode creates a database containing all the tweets of a particular user or group of users, allowing up to 5,000 users to be followed. This mode is extremely useful for following a homogeneous group of people, such as the members of a Senate or the ministers of a government. Finally, 1%-Sample returns a collection of randomly sampled tweets whose volume is equal to 1% of the total traffic traveling on the platform. These capture modes cannot be used at the same time: a choice must be made when installing the application.
Among these options, we decided to focus on Word-Tracking, which allowed us to capture tweets containing one or more previously defined keywords. This mode allows the creation of large data collections, which became fundamental for an extensive and articulated content analysis through hashtags and keywords. It is useful for analysing communication and networks between users, in order to identify the most frequent contents (text, images, video, external links) and the hashtags used to make them viral. Similar contents can be grouped into semantic categories; those categories can then be related to multiple narratives, such as conspiracy narratives. To extend the analysis, TCAT can be paired with Gephi (https://gephi.org/), an open source software that enables the analysis of user networks and is capable of relating entities of different natures, making it possible to identify niches and informational contents and to underline the relationships between different entities, such as users and the media they share or the hashtags they use. Gephi allows one to manage arcs and nodes manually and also provides layout algorithms and data filtering options to obtain a clean and intuitive graphic representation.
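The keyword capture and semantic grouping described above can be sketched in a few lines of Python. The fragment below is an illustrative analogue, not TCAT's actual implementation; the keyword lists and theme names are invented for the example.

```python
# Illustrative sketch of keyword-based tweet capture and thematic grouping,
# analogous to TCAT's Word-Tracking mode. All keyword sets are hypothetical.
TRACK_WORDS = {"flatearth", "researchflatearth", "earthisflat"}

THEMES = {
    "pandemic": {"covid19", "coronavirus"},
    "flat_earth": {"flatearth", "earthisflat"},
}

def matches_track(text, track_words=TRACK_WORDS):
    # A tweet is captured if any token (hashtag stripped of '#') is tracked.
    tokens = {t.strip("#").lower() for t in text.split()}
    return bool(tokens & track_words)

def group_by_theme(tweets, themes=THEMES):
    # Assign each tweet to every semantic category whose keywords it contains.
    groups = {name: [] for name in themes}
    for tweet in tweets:
        tokens = {t.strip("#").lower() for t in tweet.split()}
        for name, keywords in themes.items():
            if tokens & keywords:
                groups[name].append(tweet)
    return groups
```

In a real pipeline the captured tweets would then be loaded into Gephi for network analysis, as the text describes.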

Analysing the Flat Earth twittersphere and its Main Hashtags (RQ1)
DMI-TCAT provides researchers with analytical techniques that combine ease of use and flexibility through different data selection possibilities (tweet statistics, activity metrics, network analysis and content analysis) and also offers some tools for geo-referencing and ethnographic research. With the purpose of finding secondary narratives, the text of the 110,000 tweets was examined in search of words and references that allow them to be classified as related to a more specific topic, so as to identify the most discussed ones. The major limitation lies in the fact that the text alone does not allow the content of a tweet to be classified as for or against a specific theory. For example, the phrase "covid is a hoax" can appear in opposite contexts, such as "I truly believe that covid is a hoax" or "the idea that covid is a hoax is really stupid", making it difficult to distinguish users who adhere to the conspiracy theory from those who do not. Another issue is that the conspiracy topic is often used ironically, simply to enhance the visibility of content, exploiting hashtags to reach as many people as possible within the platform. In consideration of these shortcomings, we decided to base our analysis on hashtags, which provide less ambiguous semantic meanings, rather than on the textual contents of the tweets as a whole. The hashtag is in fact a founding element of the dynamics of Twitter, which started using it in August 2007, before the other social networks. The hashtag is not only textual content but the real "label" of the message: the sign of recognition, the element that can be replicated and which serves precisely to link the specific content to the wider semantic category to which the tweet refers. Moreover, as indicated above, a hashtag may be used even if the text of the tweet has nothing to do with the label that defines it. The trending topics indicated by the platform are themselves defined by the most used hashtags within it (Baym, 2015; Chang, 2010).
The importance of the hashtag within the Twitter platform led us to use it to identify the narratives. To do so, the 35 main hashtags were chosen to define the starting point of the survey.
Once highlighted, the main hashtags were grouped by theme and used as search parameters to extend the subgroup relative to the defined topic: for example, the hashtag #Covid19 was included in a subgroup characterised by the pandemic theme, which was expanded by selecting within the database the tweets containing the hashtag #Covid19, but also the Covid19, Covid_19 and COVID-19 keywords. The tools made available by TCAT were then exploited to analyse certain characteristics of the subsets and verify the existence of relationships between them. The database queries were carried out using the English lexicon, as over 92% of the sample consists of tweets in English. This peculiarity is consequently also found in the list of hashtags just formulated, although the hashtags were also used by non-native English speakers. In some cases it is fundamental to strictly define the linguistic target before the start of the data capture: the choice of keywords can affect the result of the queries. In particular, it is important to avoid words that could have a different meaning in other languages, or acronyms that could lead the research off topic, filling the dataset with misrepresentative information.
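The subgroup expansion by keyword variants (Covid19, Covid_19, COVID-19) can be sketched with a single case-insensitive pattern. This is a hedged illustration of the matching logic, not the queries actually run against the TCAT database; the tweets below are invented examples.

```python
import re

# Hypothetical sketch: expand a hashtag-defined subgroup by matching common
# spelling variants of the keyword (covid19, covid_19, covid-19, #covid19).
COVID_VARIANT = re.compile(r"#?covid[\s_-]?19")

def in_pandemic_subgroup(text):
    # Case-insensitive match on any recognised variant in the tweet text.
    return bool(COVID_VARIANT.search(text.lower()))

tweets = [
    "Lockdown again #Covid19",
    "COVID-19 numbers are rising",
    "covid_19 update",
    "flat earth forever",
]
subgroup = [t for t in tweets if in_pandemic_subgroup(t)]
```

The same pattern-based approach extends naturally to the other thematic subgroups, provided the variant lists are curated by hand as the text recommends.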

Research results: the conspiracy cluster (RQ2)
The next step was to focus on a particular partition of our dataset: the conspiracy cluster that is related to hashtags such as #conspiracy, #wakeup, #climatechange, #truth, #hoax. These keywords were used as research parameters of our collection of data with the aim to identify Tweets containing conspiracy topics.
The subgroup of conspiracy theories comprises 10,792 tweets (9.74% of the total) produced by 9,228 different profiles (11.97%). The biggest slice of these tweets refers generically to conspiracy theories; using other tools available in TCAT (the hashtag frequency tool and the word count tool), it was possible to identify other conspiracies clearly cited in this cluster. The most mentioned are:
• the no-vax movement, with 2,901 mentions from 2,705 users;
• COVID-19 and 5G technology, with 1,066 mentions from 1,013 users.
Drawing from this evidence found in the flat Earth conspiracy environment, the analysis will now focus on the conspiracies related to the COVID-19 pandemic.
The no-vax movement argues, without any scientific evidence, that vaccines are harmful to health and cause disease. Among its main arguments is the belief that pharmaceutical companies conceive vaccines for the sole purpose of profit, a belief that is periodically fueled by plots and fake news. The COVID-19/5G conspiracy narrative, in turn, is carried forward through videos and propaganda articles claiming that the cause of the COVID-19 disease is radiation from 5G antennas 5 . The two conspiracy theories seem to have mixed and self-fueled within the platform and have found a definitive point of convergence, combining those who have always been against vaccination with those who claim that the pandemic was created ad hoc to enrich the pharmaceutical lobbies.
The following sections will specifically describe the analysis highlighting the relationships between flat Earth theories, COVID-19 denial theories and no-vax theories.

Research results: analysing the relations among multiple conspiracies (RQ3)
As with the conspiracy cluster, from the broader database we created a data collection related to COVID-19 and the no-vax movement, containing approximately 9,150 tweets. The small sample does not allow significant results to be elaborated from a quantitative point of view, but it is possible to exploit the Gephi software (https://gephi.org/) to extract specific information, for example by analysing the hashtags. Thanks to this, it was possible to create a so-called co-hashtag graph that relates hashtags to each other: each hashtag is represented by a node and, when two hashtags are used in the same tweet, the nodes are joined by an arc. The weight of the arc is represented by its thickness, which indicates how many times the two hashtags have been used together (Figure 2).
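The construction of such a co-hashtag graph reduces to counting co-occurrences of hashtag pairs. The following minimal sketch builds the weighted edge list that a tool like Gephi would visualise; the tweets and hashtags are invented examples, not the study's data.

```python
from itertools import combinations
from collections import Counter

# Minimal sketch of a co-hashtag edge list: nodes are hashtags, and an
# edge's weight counts how many tweets used both hashtags together.
def co_hashtag_edges(tweets_hashtags):
    weights = Counter()
    for hashtags in tweets_hashtags:
        # Sort so each unordered pair maps to a single canonical edge key.
        for a, b in combinations(sorted(set(hashtags)), 2):
            weights[(a, b)] += 1
    return weights

tweets_hashtags = [
    ["#covid19", "#flattenthecurve"],
    ["#covid19", "#flattenthecurve", "#conspiracy"],
    ["#flatearthers", "#antivaxxers"],
]
edges = co_hashtag_edges(tweets_hashtags)
```

Exporting `edges` as a CSV of source, target and weight columns yields a file Gephi can import directly as a weighted undirected graph.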
To achieve this, the hashtag #flatearth was removed from the graph because the greater weight of its arcs would have hidden the relationships existing among the other hashtags. The graph relating to the hashtag analysis therefore highlights the relationships existing between #covid19 and #flattenthecurve; between #researchflatearth, #coronahoax and #coronavirushoax; among #corona, #conspiracy and #covid; and finally between #flatearthers and #antivaxxers. We will now analyse these groupings individually.

#flattenthecurve & #covid19
The hashtag analysis revealed a significant relationship between the two main narratives taken into consideration, namely flat Earth and the conspiracies related to COVID-19, thanks to the ambiguous use of #flattenthecurve. The origin of the hashtag on Twitter is due to a flat Earth profile of the same name, @flattenthecurve. The user started using the hashtag in September 2017, and in the following years it spread in small flat Earth niches. In March 2020 #flattenthecurve became a Twitter trend, as it was used in reference to the curve of COVID-19 infections. Focusing on the hashtags #flattenthecurve and the similar #flatteningthecurve, we detected 238 tweets in our database in which at least one of the two hashtags was used. The small number is due to the fact that, for incidental reasons, the capture was interrupted at the peak of the trend. The main finding is that a substantial slice of these tweets belongs to bot profiles, which are easily identified using Gephi by relating users to the sources of their tweets (bot_libre!, botboibodidio, twittbot.net) 6 . The metadata analysis shows that from 9 March onwards some of these bots began a systematic use of the hashtag in an attempt to exploit the trend and relaunch their content for more visibility on the platform. In fact, the analysis of a bipartite user-hashtag graph with Gephi highlights the strong relationship between the hashtag #flattenthecurve and the bots. Exploring the graph can be challenging because the most interesting data is sometimes not part of the biggest cluster but can be found in smaller ones. In these cases the use of filters is fundamental to accelerate the researcher's work.
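The bot identification step, relating users to the source field of their tweets, can be sketched as a simple lookup against a list of known automation clients. The source strings below are the ones mentioned in the text; the user records and the `flag_bot_users` helper are invented for illustration.

```python
from collections import defaultdict

# Hedged sketch of flagging likely bot accounts via tweet "source" metadata,
# analogous to the user-to-source relation explored here with Gephi.
BOT_SOURCES = {"bot_libre!", "botboibodidio", "twittbot.net"}

def flag_bot_users(tweets):
    # tweets: iterable of (user, source) pairs from the capture metadata.
    sources_by_user = defaultdict(set)
    for user, source in tweets:
        sources_by_user[user].add(source)
    # A user is flagged if any of their tweets came from a known bot client.
    return {u for u, s in sources_by_user.items() if s & BOT_SOURCES}

tweets = [
    ("alice", "Twitter Web App"),
    ("curvebot", "twittbot.net"),
    ("bob", "Twitter for iPhone"),
]
bots = flag_bot_users(tweets)
```

The flagged accounts can then be isolated as a sub-cluster in the bipartite user-hashtag graph, as described above.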

#researchflatearth -#coronahoax -#coronavirushoax
The correlation between the hashtags #researchflatearth, #coronahoax and #coronavirushoax suggests that the most radical part of the flat Earth movement also embraced the conspiracy stances regarding the coronavirus. In fact, these hashtags are not intended as simple labels but represent real messages for conspiracy theorists: #researchflatearth seems to encourage users to search for evidence about the shape of the Earth, while through #coronahoax and #coronavirushoax they make clear their belief that the global COVID-19 pandemic is a hoax.
6 An Internet bot, web robot, robot or simply bot, is a software application that runs automated tasks (scripts) over the Internet (https://en.wikipedia.org/wiki/Internet_bot).
This group was composed of a few hundred tweets: this limited amount of data allowed us to conduct a qualitative content analysis of each tweet identified. This exploratory analysis revealed multiple relations and common elements among the different conspiracy-minded or conspiracy-related profiles, such as:
• the attempt to insinuate doubt about the official history;
• the dogmatic conviction that sees their point of view as absolute truth;
• the sharing of audiovisual material in support of their beliefs.
These hashtags are not the only ones used to discuss the COVID-19 conspiracy. Examining the hashtags #Covid, #Conspiracy and #Corona, the thickness of the arcs suggests that these hashtags form the pairs #Covid #Conspiracy and #Corona #Conspiracy. In this case, however, analysing the sampled tweets individually makes it clear that the hashtag pairs are used by both conspiracy theorists and non-conspiracy theorists as simple labels to comment on the topic.
It is also common for conspiracy theorists to share audiovisual material, often hosted on other below-the-radar platforms such as the video blogging site BitChute (www.bitchute.com), known for hosting conspiracy and extremist contents, which can lead other users down the conspiratorial rabbit hole.

#flatearthers & #antivaxxers
Similar considerations apply to the next pair of hashtags: #flatearthers and #antivaxxers. In this case too, the two hashtags are used as labels to send opposite messages: on one hand there are those who label their thoughts and their belonging to the two movements with the hashtags #flatearther and #antivaxxer; on the other, there are users who use the same hashtags to make fun of the adherents of both conspiracy theories. Both counterparts, though, contribute to fueling the narrative, spreading the conspiracy and making it reach new users.
The Twitter conspiratorial sphere is a highly polarised environment in which verbal clashes are frequent. With regard to the COVID-19 topic, an eloquent example is the spread of the label #COVIDIOTS. Its peculiarity lies in the fact that it is used by both sides with opposite meanings: conspiracy theorists use it to insult those who adhere to mainstream positions about the pandemic, while non-conspiracy theorists use it to insult those who adhere to conspiracy thinking about the nature and spread of the virus.
The hashtag therefore pushed the analysis towards a search for words belonging to the vulgar vocabulary attributable to the hate speech of highly polarised environments. The dataset considered is the one used previously for the hashtag analysis, i.e., the cluster related to COVID-19 and the no-vax movement within the database on the flat Earth topic.
For this analysis we used a TCAT feature called "Word Count", which takes the text field of the Tweets in the dataset and counts the frequency of every word, returning a ranking of the most used words.
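TCAT's ranking can be approximated with a basic word-frequency count. This is an illustrative sketch, not TCAT's actual implementation, and the tweet texts are invented:

```python
import re
from collections import Counter

def word_count(tweets):
    """Rank the most frequent words across tweet texts (case-insensitive)."""
    words = Counter()
    for text in tweets:
        words.update(re.findall(r"[a-z']+", text.lower()))
    return words

tweets = ["Flat earthers are fools", "fools and morons everywhere"]
print(word_count(tweets).most_common(1))
# → [('fools', 2)]
```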
At this point it was necessary to proceed manually and personally select the hate-words from the list; the most frequent include, for example, fool, dumb, stupid, hate and morons. In this way we obtained a significant sample of words to use as a search parameter for measuring the percentage of Tweets containing vulgar words. Since these words are in English, to maintain as much coherence as possible we restricted the dataset to English-language Tweets. In addition, among the 9,000+ Tweets we excluded Retweets, since as viral content they could have skewed the data. As a result, of 2,086 original Tweets, 453 contain at least one word from the hate dictionary we defined, corresponding to 21.7% of the total, i.e., about 1 Tweet out of 5.
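The final measure reduces to checking each tweet against the hand-picked word list and computing the share of matches. In this minimal sketch, the word list and tweets are illustrative, not the study's full dictionary or data:

```python
import re

# Hand-picked hate words (illustrative subset, not the study's full dictionary).
HATE_WORDS = {"fool", "fools", "dumb", "stupid", "hate", "moron", "morons"}

def hate_share(tweets):
    """Fraction of tweets containing at least one word from the hate list."""
    hits = sum(
        1 for text in tweets
        if HATE_WORDS & set(re.findall(r"[a-z]+", text.lower()))
    )
    return hits / len(tweets)

tweets = [
    "Flat earthers are morons",
    "Interesting thread about #covid",
    "What a stupid hoax",
    "Stay safe everyone",
    "You fools believe anything",
]
print(f"{hate_share(tweets):.0%}")
# → 60%
```

Applied to the 2,086 original English-language Tweets, the same computation yields the 21.7% figure reported above (453/2,086).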

CONCLUSIONS AND RECOMMENDATIONS FOR FUTURE RESEARCH
The analysis seeks to demonstrate the synergistic relationship between different conspiracy contents on Twitter, investigating the combined use of popular hashtags such as those of flat Earth and those related to COVID-19, the anti-vax movement and deniers of the pandemic. At a quantitative level, the data analysed cannot reveal significant informational cascades; nevertheless, they highlight the ambiguous relationships between multiple conspiracies fuelling a conspiracy cluster that dramatically pollutes the public sphere of social media. The phenomenon has been identified by many authors in relation to different conspiracies. Önnerfors and Krouwel (2021) observe the mixed effect of different conspiracist narratives framing Europe as a threat to political unity, especially in times of pandemic: the authors emphasize how a conspiracy "black box" containing two intertwined narrative strands would provide both rational explanations and moral judgments concerning the existential crises affecting the continent over the last decade. Conti et al. (2020) focus on the evolving interplay between social media and international health security, analysing how fake news and disinformation could negatively affect timely information about recent pandemic spreads (Ebola, H1N1, etc.), while Shahsavari et al. (ibid.) identify the multiple, different conspiracist narratives related to COVID-19, such as 5G, Bill Gates, etc.
Moreover, the content analysis of the Tweets indicates that, in addition to fake news, there is significant polarisation, revealed by hate content and an aggressive lexicon used both by "conspiracy theorists" and by those who want to ridicule them.
The paper outlines a new approach, based on Digital Methods, to the study of the infodemic. On the basis of our research we can formulate a set of recommendations useful for further investigations. Given the spread of the pandemic in 2020, social media represent an important field of analysis, since much of the public debate migrated to online platforms. This offers researchers a wealth of data and content that, even if ephemeral, can capture the contradictory discursive dynamics that develop on social media. Analysing this data through Digital Methods entails difficulties and critical issues from both a methodological and an ethical point of view. Methodologically, the analysis presented shows some limitations, particularly linked to the volume of data processable by the tool TCAT: users of the tool can access only a small volume of Twitter content, according to the limitations imposed by the platform. More generally, the main limitation may be the focus on one specific environment, Twitter, with its own biases and constraints affecting the spread of contents and messages. A future investigation could consider multiple platforms, such as Facebook and Instagram, which also rely on hashtags, in order to analyse whether the spread of different conspiracies follows the same pattern or varies significantly.

Funding:
The author received no financial support for the research and/or authorship of this article.

Declaration of interest: Author declares no competing interest.
Data availability: Data generated or analysed during this study are available from the author on request.