Project description

The InfoTimes team investigated the latest wave of conflict between supporters and opponents of the Egyptian president on Twitter by analyzing two competing hashtags: “Erhal_Ya_Sisi” (Leave, Sisi) and “El-Sisi_Mesh_Hayerhal” (El-Sisi will not leave). InfoTimes gathered a sample of 11,268 opposing tweets (retweets, replies, and mentions) and 2,570 supporting tweets during June. The sample was then analyzed both quantitatively and qualitatively to determine the nature of the accounts that launched these hashtags and pushed them into the top of the trending list.

What makes this project innovative?

Recently, social media has been used in a different way: instead of offering an escape from reality through chatting and exchanging memes and funny pictures, it has been drawn into political conflict. This trend caught our attention in June and July, when a hashtag campaign broke out and occupied the top of Twitter’s trending list. The hashtags reflected a conflict between opponents and supporters of the Egyptian president, Abdel Fattah El-Sisi, with both sides publishing hashtags of opposite meanings and different wordings. The campaign continued for several days, each side trying to draw the largest possible number of Twitter users to its hashtag. Because the political conflict had moved to Twitter, we began to follow the issue, focusing on the most recent wave and monitoring and analyzing the two large blocs that formed under two different hashtags: “GO SISI” and “SISI WILL NOT GO.”

What was the impact of your project? How did you measure it?

60,000 page views so far. Five of the bot accounts identified in the report stopped tweeting during the first week after the story was published.

Source and methodology

We learned a great deal from this work. When an automated tool did not meet our objective and produced confusing results, we defined our own criteria for judging an account’s identity: the account’s creation date, the time link between the account’s creation and the promoted hashtag, and the pattern of the account’s tweeting activity. We sometimes found accounts republishing tweets that contradicted their own stance (“supporting” or “opposing”) simply because those tweets used the hashtag that matched the account’s general position. This suggests that such accounts are programmed to retweet anything containing a specified hashtag, regardless of the tweet’s content and whether it actually backs the account’s political stance.

We also considered the account’s name (a normal name versus a string of symbols), the profile picture used, and the ratio between the number of accounts it follows and the number of its followers; it is illogical for an account to have thousands of followers while following only one or two accounts.

After extensive filtering, classification, and tracking, we identified a number of programmed accounts (“bots”) that were highly active in this event on both sides, opposing and supporting. Such activity is dangerous because programmed accounts reduce the chance of genuine, credible discussion taking place on social media.
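As a rough illustration of these screening criteria, the following is a minimal sketch in R (the story’s own tooling) using dplyr and lubridate. The column names (screen_name, account_created_at, followers_count, friends_count) follow older rtweet conventions and are assumptions; the thresholds are illustrative, not the ones used in the story.

```r
library(dplyr)
library(lubridate)

# Flag accounts that match the manual suspicion criteria described above.
flag_suspicious_accounts <- function(accounts, hashtag_launch_date) {
  accounts %>%
    mutate(
      # Accounts created shortly before the hashtag campaign started.
      created_near_campaign = account_created_at >=
        hashtag_launch_date - days(30),
      # Thousands of followers while following almost no one.
      odd_follow_ratio = followers_count > 1000 & friends_count <= 2,
      # Names built from long runs of digits or symbols rather than a normal name.
      symbol_heavy_name = grepl("[0-9_]{5,}", screen_name)
    ) %>%
    filter(created_near_campaign | odd_follow_ratio | symbol_heavy_name)
}
```

Accounts returned by such a filter would still need the manual tracking described above before being labelled as bots.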

Technologies Used

During the two-month period of inquiry, analysis, and story preparation, our team collected during June a sample of 11,268 tweets representing the opposing hashtag (retweets, replies, and mentions) and another sample of 2,570 tweets representing the supporting hashtag, using a script written in the R programming language. The script gave us tabulated data containing the time and date of publication of each tweet, its text, links to media files (images and videos), the number of interactions with each tweet, and the name of the tweeting account. The collected data formed two large blocs, each containing smaller blocs of different sizes. We needed to break these blocs apart to identify their accounts and determine whether they were personal or programmed (“bots”), so we used Botometer, a free tool developed by the Network Science Institute at Indiana University. Botometer evaluates an account against a variety of criteria and returns an overall score between 0 and 5: the closer the score is to 0, the more likely the account is personal; the closer it is to 5, the more likely it is a bot.
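The original InfoTimes script is not published with the story, but a comparable collection step could look like the minimal sketch below, assuming the rtweet package (pre-1.0 column names) and configured Twitter API credentials. The hashtag spellings, sample sizes, and the bot-score cut-off of 4 are illustrative assumptions, not the project’s actual parameters.

```r
library(rtweet)
library(dplyr)

# Collect tweets for each of the two competing hashtags (retweets included).
opposing   <- search_tweets("#Erhal_Ya_Sisi", n = 15000, include_rts = TRUE)
supporting <- search_tweets("#El-Sisi_Mesh_Hayerhal", n = 5000, include_rts = TRUE)

# Keep the fields described above: publication time, text, media links,
# interaction counts, and the tweeting account.
sample_tbl <- bind_rows(
  mutate(opposing,   side = "opposing"),
  mutate(supporting, side = "supporting")
) %>%
  select(side, created_at, screen_name, text, media_url,
         retweet_count, favorite_count)

# After scoring each account with Botometer (0-5 scale), accounts whose
# overall score approaches 5 can be treated as probable bots.
# scored <- sample_tbl %>% mutate(bot_score = ...)  # Botometer scores joined in
# probable_bots <- filter(scored, bot_score >= 4)   # 4 is an illustrative cut-off
```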

Project members

Amr Eleraqi is the executive director of InfoTimes. With 12 years of experience in the media industry, he is a pioneer of data journalism and data storytelling in the region. Kanishk Karan is an Indian data journalist who focuses on technology, privacy, internet culture, extremism, and immigration. He studied at Columbia Journalism School and works at the Atlantic Council, an American research center. Islam Salahuddin is a junior digital journalist with a special focus on data-driven storytelling.
