Quantifying Biases in Online Information Exposure
Our consumption of online information is mediated by filtering, ranking, and
recommendation algorithms that introduce unintentional biases as they attempt
to deliver relevant and engaging content. It has been suggested that our
reliance on online technologies such as search engines and social media may
limit exposure to diverse points of view and make us vulnerable to manipulation
by disinformation. In this paper, we mine a massive dataset of Web traffic to
quantify two kinds of bias: (i) homogeneity bias, which is the tendency to
consume content from a narrow set of information sources, and (ii) popularity
bias, which is the selective exposure to content from top sites. Our analysis
reveals different bias levels across several widely used Web platforms. Search
exposes users to a diverse set of sources, while social media traffic tends to
exhibit high popularity and homogeneity bias. When we focus our analysis on
traffic to news sites, we find higher levels of popularity bias, with smaller
differences across applications. Overall, our results quantify the extent to
which our choices of online systems confine us inside "social bubbles."
Comment: 25 pages, 10 figures, to appear in the Journal of the Association for
Information Science and Technology (JASIST).
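Both bias measures admit a straightforward operationalization on a click log. The following Python sketch is one plausible reading of the definitions above, not necessarily the paper's exact estimators: homogeneity bias as one minus the normalized Shannon entropy of the source distribution, and popularity bias as the share of traffic captured by the top-k domains. All domain names are made up for illustration.

```python
from collections import Counter
import math

def homogeneity_bias(visits):
    """1 minus the normalized Shannon entropy of the source distribution.

    0 = traffic spread evenly over all sources, 1 = a single source.
    `visits` is an iterable of domain names, one per page view.
    """
    counts = Counter(visits)
    total = sum(counts.values())
    if len(counts) < 2:
        return 1.0
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return 1.0 - entropy / math.log2(len(counts))

def popularity_bias(visits, top_k=100):
    """Fraction of traffic landing on the `top_k` most-visited domains."""
    counts = Counter(visits)
    total = sum(counts.values())
    top = sum(c for _, c in counts.most_common(top_k))
    return top / total

# Toy example (hypothetical domains): social referrals concentrated on
# a few big sites versus search traffic spread evenly.
social = ["bignews.example"] * 80 + ["blog.example"] * 15 + ["niche.example"] * 5
search = ["a.example", "b.example", "c.example", "d.example"] * 25
print(homogeneity_bias(social), homogeneity_bias(search))
```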
Measuring Online Social Bubbles
Social media have quickly become a prevalent channel to access information,
spread ideas, and influence opinions. However, it has been suggested that
social and algorithmic filtering may cause exposure to less diverse points of
view, and even foster polarization and misinformation. Here we explore and
validate this hypothesis quantitatively for the first time, at the collective
and individual levels, by mining three massive datasets of web traffic, search
logs, and Twitter posts. Our analysis shows that collectively, people access
information from a significantly narrower spectrum of sources through social
media and email, compared to search. The significance of this finding for
individual exposure is revealed by investigating the relationship between the
diversity of information sources experienced by users at the collective and
individual levels. There is a strong correlation between collective and
individual diversity, supporting the notion that when we use social media we
find ourselves inside "social bubbles". Our results could lead to a deeper
understanding of how technology biases our exposure to new information.
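To make the collective-versus-individual comparison concrete, here is a minimal Python sketch, assuming Shannon entropy as the diversity measure (the paper's estimator may differ) and a click log of (platform, user, domain) events. The correlation between the two resulting columns across platforms could then be checked with, e.g., scipy.stats.pearsonr.

```python
from collections import Counter, defaultdict
import math

def shannon_entropy(items):
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def diversity_by_platform(records):
    """records: iterable of (platform, user, domain) click events.

    Returns {platform: (collective_entropy, mean_individual_entropy)}.
    """
    by_platform = defaultdict(list)
    by_user = defaultdict(lambda: defaultdict(list))
    for platform, user, domain in records:
        by_platform[platform].append(domain)
        by_user[platform][user].append(domain)
    out = {}
    for platform, domains in by_platform.items():
        users = by_user[platform]
        mean_ind = sum(shannon_entropy(v) for v in users.values()) / len(users)
        out[platform] = (shannon_entropy(domains), mean_ind)
    return out

# Toy log with hypothetical platforms, users, and domains.
records = [
    ("social", "u1", "bignews.example"), ("social", "u1", "bignews.example"),
    ("social", "u2", "bignews.example"), ("search", "u1", "a.example"),
    ("search", "u1", "b.example"), ("search", "u2", "c.example"),
]
print(diversity_by_platform(records))
```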
Quality of Service System Approximation in IP Networks
This paper is sponsored by the Ministry of Education and Research of the Republic of
Bulgaria in the framework of project No. 105, "Multimedia Packet Switching Networks Planning with Quality of Service and Traffic Management".
This paper presents Quality of Service analyses in wired and
wireless IP networks based on the three popular techniques – RSVP, IntServ,
and DiffServ. The analyses are based on a quick approximation schema of
the traffic system with static and dynamic changes of the system bounds.
We offer a simulation approach in which a typical leaky bucket model is
approximated by a G/D/1/k traffic system with flexible bounds on waiting
time, loss, and priority. The approach is applied to two cascaded leaky
buckets. The derived traffic system is programmed in C++. The simulation
model adapts to dynamic traffic changes and priorities. Student's t-criterion
is applied in the simulation program to validate the results. The results
of the simulation demonstrate the viability of the proposed solution and its
applicability for fast system reconfiguration in dynamic environments. The
simulated services cover a typical range of traffic sources such as VoIP,
LAN emulation, and transaction exchange.
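The paper's simulator is written in C++; the following is a stripped-down Python sketch of the core G/D/1/k idea only, with Poisson arrivals standing in for the general arrival process and a fixed capacity k (the paper's dynamic bounds and priorities are omitted). It reports the loss rate and the mean waiting time, the two quantities the approach bounds.

```python
import random
from collections import deque

def simulate_gd1k(arrival_rate, service_time, k, n_arrivals, seed=1):
    """Single-server FIFO queue with deterministic service and room for
    k packets in total (G/D/1/k). Poisson arrivals are used here purely
    for illustration. Returns (loss_rate, mean_waiting_time)."""
    rng = random.Random(seed)
    departures = deque()              # departure times of packets in system
    t, lost, waits = 0.0, 0, []
    for _ in range(n_arrivals):
        t += rng.expovariate(arrival_rate)
        while departures and departures[0] <= t:
            departures.popleft()      # packets that have already left
        if len(departures) >= k:
            lost += 1                 # system full: packet dropped
            continue
        start = max(t, departures[-1]) if departures else t
        departures.append(start + service_time)
        waits.append(start - t)       # time spent waiting for the server
    return lost / n_arrivals, (sum(waits) / len(waits)) if waits else 0.0

loss, wait = simulate_gd1k(arrival_rate=0.9, service_time=1.0, k=10,
                           n_arrivals=100_000)
print(f"loss rate {loss:.4f}, mean wait {wait:.3f}")
```

Cascading two such systems, as the paper does for two leaky buckets, amounts to feeding the accepted departures of the first queue into the second.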
VoIP Traffic Shaping Analyses in Metropolitan Area Networks
This paper presents VoIP shaping analyses for devices that apply the three Quality of Service
techniques – IntServ, DiffServ, and RSVP. The results show queue management and packet stream shaping
based on simulation of the three most in-demand services – VoIP, LAN emulation, and transaction exchange.
Special attention is paid to VoIP as the most demanding real-time communication service.
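As a companion illustration, here is a minimal leaky-bucket conformance check of the kind such shapers are built on; the rate and burst values are hypothetical, sized for a 64 kbit/s VoIP stream sending 160-byte packets every 20 ms.

```python
class LeakyBucket:
    """Leaky-bucket shaper state: the bucket drains at `rate` bytes/s
    and holds at most `burst` bytes. A packet conforms if it fits in
    the bucket at its arrival time; non-conforming packets are left
    for the caller to delay or drop."""
    def __init__(self, rate, burst):
        self.rate, self.burst = rate, burst
        self.level, self.last = 0.0, 0.0

    def conforms(self, arrival_time, size):
        # Drain the bucket for the time elapsed since the last packet.
        elapsed = arrival_time - self.last
        self.level = max(0.0, self.level - elapsed * self.rate)
        self.last = arrival_time
        if self.level + size <= self.burst:
            self.level += size
            return True
        return False

# Hypothetical numbers: 160-byte packets every 20 ms = 64 kbit/s.
bucket = LeakyBucket(rate=8000, burst=1600)   # 8000 B/s, 1600 B burst
t = 0.0
for _ in range(5):
    print(bucket.conforms(t, 160))
    t += 0.02
```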
LEVERAGING YARA AND SIGMA RULES TO DETECT CHINESE STATE-SPONSORED HACKING GROUPS OF THE "TYPHOON" TYPE
This study addresses the escalating cyber threat posed by Chinese state-sponsored hacking groups, particularly the "Typhoon" class (Salt Typhoon and Volt Typhoon), which target critical infrastructure through stealthy and persistent techniques. The research aims to enhance detection capabilities against these advanced persistent threats by analysing their tactics, techniques, and procedures (TTPs) and by developing YARA and Sigma rules. The methodology involves mapping observed TTPs to MITRE ATT&CK and designing detection rules that identify key indicators of compromise in both system files and event logs. The main contribution of the study is the implementation of rule-based detection mechanisms that proactively uncover malicious activities often missed by traditional signature-based tools.
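As a concrete illustration of the rule-based approach, here is a deliberately generic sketch using the yara-python bindings. The rule, its strings, and the sample buffer are placeholders rather than vetted Typhoon indicators; a production rule would encode the IoCs derived from the MITRE ATT&CK mapping described above.

```python
import yara  # pip install yara-python

# Illustrative only: these strings are placeholders, NOT verified
# Typhoon indicators of compromise.
RULE = r'''
rule typhoon_like_lolbin_usage
{
    meta:
        description = "Hypothetical example: living-off-the-land command patterns"
    strings:
        $a = "netsh interface portproxy add" ascii nocase
        $b = "wmic process call create" ascii nocase
    condition:
        any of them
}
'''

rules = yara.compile(source=RULE)
sample = b"cmd.exe /c netsh interface portproxy add v4tov4 ..."
for match in rules.match(data=sample):
    print(match.rule)   # -> typhoon_like_lolbin_usage
```

A Sigma rule would express the event-log side of the same logic in YAML, to be compiled for a specific SIEM backend.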
Fault Tolerance for Real-Time Systems: Analysis and Optimization of Roll-back Recovery with Checkpointing
Increasing soft error rates in recent semiconductor technologies enforce the use of fault tolerance. While fault tolerance enables correct operation in the presence of soft errors, it usually introduces a time overhead. This overhead is particularly important for real-time systems (RTSs), where correct operation means producing the correct result of a computation while satisfying given time constraints (deadlines). Depending on the consequences of violating deadlines, RTSs are classified into soft and hard RTSs: violating deadlines in soft RTSs usually results in some performance degradation, while violating deadlines in hard RTSs has catastrophic consequences. To determine whether deadlines are met, RTSs are analyzed with respect to average execution time (AET), used for soft RTSs, and worst-case execution time (WCET), used for hard RTSs. When fault tolerance is employed in either kind of RTS, its time overhead may itself cause deadlines to be violated, so the usage of fault tolerance in RTSs needs to be optimized.

To enable correct operation of RTSs in the presence of soft errors, this thesis considers Roll-back Recovery with Checkpointing (RRC), a fault tolerance technique that copes efficiently with soft errors. The major drawback of RRC is a time overhead that depends on the number of checkpoints used. Depending on how the checkpoints are distributed throughout the execution of a job, we consider two checkpointing schemes: equidistant checkpointing, where the checkpoints are evenly distributed, and non-equidistant checkpointing, where they are not. The goal of this thesis is to provide an optimization framework for RRC in RTSs that covers the optimization objectives important for RTSs. Such a framework assists the designer of an RTS at the early design stage, when different fault tolerance techniques must be explored and one chosen that meets the specification requirements of the RTS to be implemented. Using the framework, the designer can determine whether RRC is a suitable fault tolerance technique for the system at hand.

The framework includes the following optimization objectives. For soft RTSs, we optimize RRC with respect to AET. For equidistant checkpointing, the framework provides the optimal number of checkpoints that minimizes the AET. For non-equidistant checkpointing, it provides two adaptive techniques that estimate the probability of errors and adjust the checkpointing scheme (the number of checkpoints over time) so as to minimize the AET.

While AET-based analyses suffice for soft RTSs, for hard RTSs it is more important to maximize the probability that deadlines are met. To evaluate the extent to which a deadline is met, the thesis uses the statistical concept of Level of Confidence (LoC): the LoC with respect to a given deadline is the probability that a job (or a set of jobs) completes before that deadline. As a metric, LoC applies equally to soft and hard RTSs; as an optimization objective, it is used for hard RTSs. Therefore, for hard RTSs, we optimize RRC with respect to LoC. For equidistant checkpointing, the framework provides (1) for a single job, the optimal number of checkpoints that maximizes the LoC with respect to a given deadline, and (2) for a set of jobs running in sequence under a global deadline, the number of checkpoints to assign to each job so that the LoC with respect to the global deadline is maximized. For non-equidistant checkpointing, the framework provides the distribution of a given number of checkpoints that maximizes the LoC with respect to a given deadline.

Since the specification of an RTS may include a reliability requirement that all deadlines be met with some probability, the thesis introduces the concept of Guaranteed Completion Time: a completion time such that the probability that a job completes within it is at least equal to a given reliability requirement. The framework includes Guaranteed Completion Time as an optimization objective and, assuming equidistant checkpointing, provides the optimal number of checkpoints that minimizes the Guaranteed Completion Time.
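To illustrate the equidistant AET objective, here is a small numeric sketch under one common analytical model, which is an assumption and not necessarily the thesis's exact AET expression: errors arrive as a Poisson process with rate `lam`, each checkpoint costs `tau`, and a segment plus its checkpoint overhead is re-executed until it completes error-free.

```python
import math

def expected_time(T, tau, lam, n):
    """Expected completion time with n equidistant checkpoints.

    Assumed model (not the thesis's exact one): errors follow a Poisson
    process with rate lam, so a segment of length T/n plus checkpoint
    overhead tau is attempted exp(lam * (T/n + tau)) times on average.
    """
    seg = T / n + tau
    return n * seg * math.exp(lam * seg)

def optimal_checkpoints(T, tau, lam, n_max=1000):
    """Exhaustively pick the n in [1, n_max] minimizing the expected time."""
    return min(range(1, n_max + 1), key=lambda n: expected_time(T, tau, lam, n))

# Hypothetical job: 100 time units, 0.5 per checkpoint, error rate 0.05.
n_star = optimal_checkpoints(T=100.0, tau=0.5, lam=0.05)
print(n_star, expected_time(100.0, 0.5, 0.05, n_star))
```

Too few checkpoints make re-execution after an error expensive; too many make the checkpointing overhead dominate, so the expected time is minimized at an interior n.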
Correlation between pain and muscle strength in patients with adult scoliosis
INTRODUCTION: In this study, we investigate the possible benefits of two different combinations of exercise-based therapies in the treatment of adult idiopathic scoliosis.
MATERIALS AND METHODS: A total of 62 patients (mean age 31.43 years) were selected to participate in the physiotherapy protocol. Pain and trunk muscle strength in patients with scoliosis (mean Cobb angle 10.93°) were measured in each subject prior to the treatment intervention and 6 months following the intervention. The level of pain and trunk muscle strength were analyzed on each test so that pre- and post-treatment comparisons could be made.
RESULTS AND CONCLUSION: After 6 months of treatment, the experimental group averaged a 2.75-point reduction in pain on the visual analogue scale (VAS), and the control group averaged a 1.88-point reduction. None of the patients had an increase in pain. Trunk muscle strength increased in both groups. The combined use of spinal mobilization and postural therapy appeared to significantly reduce pain and increase trunk muscle strength in all 62 subjects. These results warrant further testing of this protocol.
Right and left, partisanship predicts (asymmetric) vulnerability to misinformation
We analyze the relationship between partisanship, echo chambers, and
vulnerability to online misinformation by studying news sharing behavior on
Twitter. While our results confirm prior findings that online misinformation
sharing is strongly correlated with right-leaning partisanship, we also uncover
a similar, though weaker, trend among left-leaning users. Because of the
correlation between a user's partisanship and their position within a partisan
echo chamber, these types of influence are confounded. To disentangle their
effects, we perform a regression analysis and find that vulnerability to
misinformation is most strongly influenced by partisanship for both left- and
right-leaning users.
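The disentangling step can be illustrated with a multiple regression on synthetic data; every variable and coefficient below is a stand-in, not the paper's data or model. With both partisanship extremity and an echo-chamber score as predictors, their separate contributions become estimable even though the two are correlated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Synthetic stand-ins: partisanship in [-1, 1] (left to right) and an
# echo-chamber score correlated with partisan extremity by construction.
partisanship = rng.uniform(-1, 1, n)
extremity = np.abs(partisanship)
echo_chamber = 0.7 * extremity + 0.3 * rng.random(n)

# Hypothetical outcome: misinformation sharing driven mostly by
# extremity, weakly by echo-chamber position, plus noise.
misinfo = 0.5 * extremity + 0.1 * echo_chamber + rng.normal(0, 0.1, n)

# Regressing on both predictors separates their contributions, which a
# simple bivariate correlation would confound.
X = sm.add_constant(np.column_stack([extremity, echo_chamber]))
model = sm.OLS(misinfo, X).fit()
print(model.summary(xname=["const", "partisanship_extremity", "echo_chamber"]))
```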
- …
