Proceedings of the 34th Annual Hawaii International Conference on System Sciences

Abstract

Currently, a large fraction of users access network resources through web clients and browsers. The majority of today's Web services are generally not concerned about the level of Quality of Service (QoS) presented to their users. However, there is a small but increasing number of web sites that need to maintain their popularity and reputation, and are therefore concerned about the QoS experienced by their users. Furthermore, new web applications demand delivery of multimedia data in real time (e.g., streaming of stored video and audio), and information transfer via the Internet is becoming one of the principal paradigms for business: electronic sales, banking, finance, and collaborative work are a few examples. The QoS perceived by its users is thus becoming a dominant factor in the success of an Internet-based Web service. The principal QoS attributes these users perceive are those related to the "responsiveness" of the service, i.e., its availability and timeliness. A service that is frequently unavailable may tarnish the reputation of the service provider or result in lost opportunities. Moreover, from the user's perspective, a service that exhibits poor responsiveness is virtually equivalent to an unavailable service.

The performance perceived by the users of a Web service depends on the performance of the protocols that operate between web clients and servers. Solutions currently investigated in the literature range from new protocols that enhance the network QoS (e.g., differentiated services) to mechanisms that run on the end systems and enhance the QoS perceived by the users (e.g., caching, prefetching, Web server replication). Currently, an IP network provides only a best-effort service.
While it is envisaged that, in the medium term, the Internet will evolve into a universal transport service able to deliver traffic with QoS guarantees when required, in the short term the middleware must cope with insufficient bandwidth and high latency.

The minitrack focuses on the design and analysis of middleware techniques and protocols for providing Web services with QoS guarantees by addressing the following issues: i) document retrieval latency, ii) data availability, iii) amount of data to transfer, and iv) redistribution of network accesses to avoid network congestion.

Cardellini, Casalicchio and Colajanni focus on architectures that adopt server replication to construct Web services with QoS guarantees. Specifically, they compare and contrast local and geographical replication of Web servers. The comparison considers three scenarios: Web publishing sites with static pages, Web publishing sites with dynamic pages, and e-commerce sites with a large percentage of secure requests.

Ghini, Panzieri and Roccetti present and evaluate a novel approach to providing a geographically replicated Web service. Specifically, in their approach all the Web replicas contribute to solving a query according to the data rate each can currently provide. Empirical results indicate that the proposed approach can be effective in improving user response time with respect to the classical HTTP approach.

Cherkasova, DeSouza and Ponnekanti focus on local replication of Web servers. They analyze a low-cost, scalable load-balancing strategy called FLEX, whose main idea is to distribute memory and processor loads among all the nodes of a cluster.
Their simulation results indicate that FLEX can be effectively used to provide QoS in shared web hosting systems subject to changes in traffic patterns.

Cherkasova and Phaal investigate an important aspect of QoS for commercial Web servers: all requests related to a particular session must be successfully completed. To this end, they propose control strategies that optimize the performance of session-based admission control. Simulation results demonstrate the superiority of their strategies over existing alternatives.

Lau, Kumar and Venkatesh investigate the potential of caching as a means to improve quality of reception (QoR) in the context of continuous-media applications. Their paper addresses the problem of web caching with a strategy that captures the effects of a large number of underlying network dynamics. Their improved Greedy Dual algorithm, based on a flexible cost function, is shown to outperform existing methods.

Acknowledgements: We wish to take this opportunity to thank all the authors for their submissions and all the referees for the time and effort they spent to review each paper in a professional way.
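For readers unfamiliar with the Greedy Dual family of caching policies mentioned above, the following is a minimal sketch of the classic GreedyDual-Size variant, which balances a per-object retrieval cost against object size via an inflation value. The class name, parameters and the simple cost/size priority are illustrative assumptions for exposition; they are not the authors' implementation or their flexible cost function.

```python
class GreedyDualSizeCache:
    """Sketch of GreedyDual-Size caching: each object gets priority
    H = L + cost/size; evictions remove the minimum-H object and raise
    the global inflation value L to that minimum.  (Illustrative only.)"""

    def __init__(self, capacity):
        self.capacity = capacity   # total cache budget (e.g., bytes)
        self.used = 0              # budget currently consumed
        self.L = 0.0               # global inflation value
        self.H = {}                # key -> current priority
        self.size = {}             # key -> object size

    def access(self, key, size=None, cost=1.0):
        """Returns True on a hit, False on a miss (object is then cached)."""
        if key in self.H:                        # hit: refresh priority
            self.H[key] = self.L + cost / self.size[key]
            return True
        # miss: evict lowest-priority objects until the new one fits
        while self.used + size > self.capacity and self.H:
            victim = min(self.H, key=self.H.get)
            self.L = self.H.pop(victim)          # inflate L to evicted priority
            self.used -= self.size.pop(victim)
        self.H[key] = self.L + cost / size
        self.size[key] = size
        self.used += size
        return False
```

For example, with a budget of 100 units, caching two 60-unit objects forces the first one out, and a subsequent access to the second is a hit. Because L only ever grows, recently cached objects naturally outrank long-idle ones without any explicit timestamps.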
