<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <title>OAR@UM Community:</title>
  <link rel="alternate" href="https://www.um.edu.mt/library/oar/handle/123456789/14102" />
  <subtitle />
  <id>https://www.um.edu.mt/library/oar/handle/123456789/14102</id>
  <updated>2026-04-14T13:15:43Z</updated>
  <dc:date>2026-04-14T13:15:43Z</dc:date>
  <entry>
    <title>Unified load balancing strategies for enhanced cloud computing solutions</title>
    <link rel="alternate" href="https://www.um.edu.mt/library/oar/handle/123456789/145526" />
    <author>
      <name />
    </author>
    <id>https://www.um.edu.mt/library/oar/handle/123456789/145526</id>
    <updated>2026-04-14T12:34:47Z</updated>
    <published>2025-01-01T00:00:00Z</published>
    <summary type="text">Title: Unified load balancing strategies for enhanced cloud computing solutions
Abstract: Cloud computing offers scalable, on-demand resources that enable a variety of services and applications. Effective load balancing in cloud environments is essential for maintaining performance and Quality of Service (QoS). These environments present complex, dynamic conditions that make efficient load balancing challenging. Many existing algorithms focus on single-objective optimisation, such as minimising response time, which often results in trade-offs and inefficiencies when dealing with unpredictable workloads. This dissertation tackles these inefficiencies by introducing a unified, multi-objective load balancing strategy that combines Ant Colony Optimisation (ACO) and Genetic Algorithm (GA) techniques. The hybrid ACO-GA algorithm is implemented within the CloudAnalyst simulation environment, leveraging ACO’s rapid local search and GA’s global exploration capabilities to dynamically balance workloads across cloud resources. Extensive simulation experiments demonstrate that the proposed hybrid approach significantly improves key QoS metrics compared to both conventional and state-of-the-art load balancers. The ACO-GA consistently achieved substantially lower average response times and improved load distribution relative to traditional algorithms. For example, it reduced mean response time versus Round Robin by roughly 50% under light workloads and by about 40% under heavy loads. The hybrid method also outperformed modern heuristics, sustaining about 8–10% faster response than advanced metaheuristic policies while shortening data centre processing delays. These gains were accompanied by more efficient resource utilisation, as the algorithm prevented server overloading and underutilisation through balanced task allocation. Notably, performance improvements persisted across both low and high demand scenarios, highlighting the algorithm’s robust adaptability to dynamic cloud conditions.
Overall, the results affirm that this unified ACO-GA strategy effectively addresses the limitations of single-objective approaches, offering a significant enhancement in cloud service performance, resource utilisation and QoS.
Description: M.Sc. ICT(Melit.)</summary>
    <dc:date>2025-01-01T00:00:00Z</dc:date>
  </entry>
  <entry>
    <title>A framework to support test tool design and acquisition</title>
    <link rel="alternate" href="https://www.um.edu.mt/library/oar/handle/123456789/144337" />
    <author>
      <name />
    </author>
    <id>https://www.um.edu.mt/library/oar/handle/123456789/144337</id>
    <updated>2026-02-26T13:40:14Z</updated>
    <published>2026-01-01T00:00:00Z</published>
    <summary type="text">Title: A framework to support test tool design and acquisition
Abstract: Software testing is an important facet of software delivery, supported by tools intended to improve the efficiency and effectiveness of testing. Industry experience and academic research show that tool adoption can be problematic; tools are acquired but not used, or are used but do not deliver.
The research problem this thesis addresses is how to design tools that better match the needs of testers operating in an increasingly complex socio‐technical environment. Industry practitioners’ and experts’ experiences with tools were explored through in‐depth interviews, workshops and surveys. It was found that testers experienced frustrations arising from tools which, while offering attractive interfaces, did not provide the quality in use necessary to meet testers’ needs. In this work, this is referred to as the ‘illusion of usability’. This illusion arises from a superficial understanding of usability as being focused on the user interface, from working with a limited persona set, and from focusing narrowly on usability without considering the other attributes that make up quality in use.
Furthermore, finding that testers do not conform to the stereotype of IT workers, and cannot be represented in tool design by a simple, small set of personas or archetypes, it was decided to apply an HCI lens to the problem, with the research question “How can HCI techniques help with the design of test tools?” In answering this question, this work proposes an empirically grounded framework (idea‐t), which supports decision making in both the design and acquisition of tools through a set of heuristics, guidelines and activities.
The idea‐t framework (“Influencing the Design, Evaluation and Acquisition of Tools for Testing”) emerged from a series of studies and was iteratively reviewed and validated through five industry case studies. Learning was carried forward from each case study and applied to the framework. The five formative case studies iteratively informed the development of the framework while also providing evidence of its effectiveness. Participants reported benefits including new insights and improved communication within their teams. A final retrospective analysis evaluated the framework by examining a backlog of customer issues raised on a commercial tool; it was found that potentially 40% of issues could have been mitigated by the idea‐t framework. Expert reviews were also carried out to assess the latest version of the framework; experts from testing, test tool development, and HCI provided positive feedback on the framework’s efficacy and suggestions for its practical application.
Description: Ph.D.(Melit.)</summary>
    <dc:date>2026-01-01T00:00:00Z</dc:date>
  </entry>
  <entry>
    <title>Causal consistency through a novel distributed middleware over strongly consistent transaction processing</title>
    <link rel="alternate" href="https://www.um.edu.mt/library/oar/handle/123456789/135067" />
    <author>
      <name />
    </author>
    <id>https://www.um.edu.mt/library/oar/handle/123456789/135067</id>
    <updated>2025-12-12T06:54:21Z</updated>
    <published>2023-01-01T00:00:00Z</published>
    <summary type="text">Title: Causal consistency through a novel distributed middleware over strongly consistent transaction processing
Abstract: Our research deals with the concept of causal consistency of data in the context of transactional information systems with scalability and high availability requirements. We deal with the consistency of data which is stored and replicated in multiple physical locations. Given the data store’s distributed nature, a new set of data inconsistency issues arises. These cause clients to get an inconsistent, and therefore possibly incorrect, view of the data, yielding application errors and even susceptibility to security vulnerabilities. Most of these problems do not impact centralised databases, but centralised databases do not provide the resiliency and performance characteristics required by modern enterprise transactional information systems.
We focus on this set of data inconsistency problems and propose solutions to strengthen consistency guarantees without jeopardising the benefits of a distributed database. We model causal consistency, the strongest type of consistency that can be implemented in fault-tolerant, scalable databases, using the Actor model of computation. The model is then implemented on top of commercially ready relational database management systems that are built to provide strong consistency.
Data Inconsistency, Transaction Inconsistency and Integrity Invariant Violation are three related, but distinct, problems tackled in this research. For each problem, we review the literature as well as design, implement and evaluate a novel solution. Our work shows that it is possible to have a distributed middleware that implements causal consistency with transaction consistency and integrity invariant preservation over a set of disconnected relational databases deployed within geographically distributed data centres. Thus, our approach addresses each problem whilst meeting the scalability and resiliency needs of modern systems.
Empirical results show that our middleware achieves better performance when compared to a single-node (i.e., non-distributed) relational database management system. We also extend our solution for Data Inconsistency and deploy the middleware on many machines within a data centre. In doing so, we identify and propose solutions for the complexities that arise from scaling the middleware horizontally, whilst our benchmarks show a significant increase in the number of operations that can be processed at each data centre, and that data changes are replicated across geographically distributed instances of the system within acceptable timeframes.
Description: Ph.D.(Melit.)</summary>
    <dc:date>2023-01-01T00:00:00Z</dc:date>
  </entry>
  <entry>
    <title>A blockchain-based framework and process guide for intelligent exchange and use of health information in low resource environments</title>
    <link rel="alternate" href="https://www.um.edu.mt/library/oar/handle/123456789/129597" />
    <author>
      <name />
    </author>
    <id>https://www.um.edu.mt/library/oar/handle/123456789/129597</id>
    <updated>2024-12-05T09:30:19Z</updated>
    <published>2024-01-01T00:00:00Z</published>
    <summary type="text">Title: A blockchain-based framework and process guide for intelligent exchange and use of health information in low resource environments
Abstract: Enterprise software systems integration can be simple or complicated depending on the number of components and the predictability of component interactions. Designing enterprise software systems with many adaptive components that learn as they interact is not easy. Software systems are often designed as function-specific systems that mimic user concerns modeled around organizational structure and communication patterns. Even a single, simple enterprise now has multiple integrated applications. Traditional integration styles are file sharing, shared databases, remote procedure calls, and messaging. These traditional approaches often require a trusted, centralized access-issuing database owner for multi-stakeholder enterprise systems. In the last decade, a new trustless software integration pattern has been facilitated by blockchain. Enterprise blockchain frameworks have been developed, yet practical use cases are few. Use cases still require domain data standardization, token modeling, and interfaces for regulatory intervention while preserving participants' privacy. This thesis investigates enterprise integration using the Health Information Exchange (HIE) use case, whose value proposition to healthcare stakeholders is well established. Governments, software vendors, non-profits, and private players traditionally perform the central HIE intermediation role. These intermediation efforts faced many bottlenecks. One main challenge is exchanging large numbers of structured, unstructured, and standardized datasets and terminology sets. This complexity has, over the years, resulted in many healthcare data standards.
Description: Ph.D.(Melit.)</summary>
    <dc:date>2024-01-01T00:00:00Z</dc:date>
  </entry>
</feed>