ASIAN'02, the 7th Asian Computing
Science Conference, Dec. 4-6, 2002, Hanoi, Vietnam.
The popular success of the Internet is due in large part to the World Wide
Web. The Web was initially designed as a system for the storage and retrieval
of distributed data.
We are witnessing an evolution of the services supported by the Global Network.
More and more, applications running over the net will consist of complex data
search and retrieval (electronic commerce, ...) and high-performance computing
tasks, for the laboratory user (physics, biology, ...), the industrial user
(aeronautics, pharmaceutics, ...), as well as for the end user (weather services,
stock market, ...).
For middleware and application developers, the Internet is increasingly seen
as a programming environment for data mining, data processing, cooperative
work, etc. This has led to "Internet Computing", a fast-growing
scientific area encompassing distributed computing and networking.
Together with the associated concepts of Grids and Peer-to-Peer exchanges,
Internet Computing is the source of several scientific and technological challenges.
Among them: the efficient and fair sharing of globalized resources, and
the complete yet transparent security of data exchanges.
Internet Performance: An Introduction to Performance Evaluation with Application
to Communication Networks, by: Prof. Alain Jean-Marie, University of Montpellier 2,
France.
Monday, December 2nd, 2002, 9:00-17:00, Asian Institute of Technology
Registration Fee: 3,000 THB
Registration Deadline: November 20th, 2002
Enquiries: Patcharee Basu, +66 2524
This tutorial is an introduction to the field of Performance Evaluation, with
applications to problems concerning the Quality of Service in the Internet,
and communication networks in general. The tutorial will first review the
basic mathematical tools and results used in the modeling of telecommunication
systems: stochastic processes, Markov Chains, and the theory of queues. Then,
as illustrations and applications, we will detail models of several algorithms
or protocols used in networks: routing algorithms; flow control mechanisms,
including TCP; mechanisms for service differentiation, including IntServ and
DiffServ.
Tentative list of topics:
Introduction: methodology of performance evaluation in networks
The basics of random processes
The theory of queues: description, measures of performance, general
Stochastic models of network traffic
Application 1: capacity and route planning in networks
Application 2: estimating loss probabilities in networks
Application 3: analysis of the differential service in the Internet
Application 4: analysis of flow control mechanisms in networks
Application 5: calculation of performance guarantees with the Network Calculus
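As a taste of the queueing material listed above, the following sketch applies the standard M/M/1 formulas (Poisson arrivals at rate lam, exponential service at rate mu); the numeric rates are illustrative only, not taken from the tutorial:

```python
# Minimal illustration of a classic queueing-theory model: the M/M/1 queue
# with Poisson arrivals (rate lam) and exponential service times (rate mu).

def mm1_metrics(lam, mu):
    """Return utilization, mean number in system, and mean sojourn time
    for a stable M/M/1 queue (requires lam < mu)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu            # utilization: fraction of time the server is busy
    n = rho / (1 - rho)       # mean number of customers in the system
    t = 1 / (mu - lam)        # mean time in system; note Little's law n = lam * t
    return rho, n, t

# Example: 8 arrivals/s served at 10/s gives 80% load and 0.5 s mean delay.
rho, n, t = mm1_metrics(lam=8.0, mu=10.0)
print(rho, n, t)
```

The closed-form expressions above are the textbook results for this queue; heavier models (networks of queues, non-Poisson traffic) are what the tutorial's later topics address.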
About the Speaker:
Alain Jean-Marie received the Maîtrise ès Sciences in Mathematics and
the Doctorat de 3e cycle from the University of Paris XI at Orsay, in 1985
and 1987 respectively. During 1988-89, he was a post-doctoral fellow
at the University of Maryland. From 1987 to 1999, he was a member of
the performance evaluation group at INRIA Sophia Antipolis, France. He is
now professor at the University of Montpellier, France. His research interests
include modeling and performance evaluation for real-time systems and communication
networks, control theory, analysis of algorithms and parallel computing.
Development of Information and Communication Technology
Curricula: Some Experience, by Professor Finn Arve Aagesen, Department
of Telematics, Norwegian University of Science and Technology.
Friday Nov. 15, 2002, 10:00-11:00, CSIM #209
Information and Communication Technology (ICT) has emerged as a new technological
area and a new field of study as a result of the convergence of computer and
communications technologies. Many universities, including the Norwegian University
of Science and Technology (NTNU), have started reorganizing their existing
traditional departments in order to accommodate this new field of
study. Traditional departments that could contribute to this new field include
communications engineering, computer science/engineering, electronics engineering
and mathematics. This presentation will explain how NTNU has re-organized
its faculty and department structure as well as how it has developed and improved
its ICT curriculum during the last three years. Details of courses will also
be presented.
WebVigil: A Rule-Based Approach to Change Detection
in Web Environments, by: Prof.
Sharma Chakravarthy, Computer Science and Engineering Department,
the University of Texas at Arlington, Arlington, USA.
Friday, 2 August 2002, 11:00-12:00, CSIM #209
Active capability introduces a new paradigm for situation monitoring, be it
for inventory control, intelligent push, E-Commerce, or event-driven distributed
applications. Over the last several years there has been a lot of work in
developing event specification languages and event detection algorithms (both
centralized and distributed), incorporating active capability using several
different approaches, and the development of design and visualization tools.
In this project, we apply the ECA paradigm for change detection in large network-centric
environments. This project investigates timely detection and propagation of
changes to arbitrary documents. Details of specification and the architecture
will be described in detail. The use of ECA rules and the reuse of components
developed in the Sentinel project will be highlighted.
Finally, we discuss change detection for XML and HTML documents and the use
of remote and local wrappers. We will also highlight other related projects
being carried out.
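As a rough illustration of the ECA (event-condition-action) idea applied to change detection, here is a minimal sketch; the class, the hashing approach, and all names are hypothetical and do not reflect the actual WebVigil or Sentinel implementations:

```python
import hashlib

# Hypothetical sketch of an event-condition-action (ECA) rule for detecting
# changes to fetched documents. Event: a page fetch; condition: a user
# predicate over the new content; action: an arbitrary callback.

class ChangeRule:
    def __init__(self, condition, action):
        self.condition = condition    # predicate over (url, new content)
        self.action = action          # callback fired when the rule triggers
        self._last = {}               # url -> hash of last seen content

    def on_fetch(self, url, content):
        """Event handler: fire the action if the page changed and the
        condition holds on the new content."""
        digest = hashlib.sha256(content.encode()).hexdigest()
        old = self._last.get(url)
        if old is not None and old != digest and self.condition(url, content):
            self.action(url)
        self._last[url] = digest

# Usage: watch for changes to pages that mention a price.
changes = []
rule = ChangeRule(condition=lambda url, c: "price" in c,
                  action=changes.append)
rule.on_fetch("http://example.com", "price: 10")   # first sight, no event
rule.on_fetch("http://example.com", "price: 12")   # change detected
```

Whole-document hashing is the crudest possible detector; systems like the one described would use structure-aware comparison to report *what* changed, not merely that something did.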
Computational Grids: New Challenges for Distributed
Computing, by Prof. Yakup Paker,
Department of Computer Science, Queen Mary, University of London.
Friday, March 8, 2002, 13:00-14:00, CSIM #209
Clustering of a wide variety of geographically distributed resources, such
as supercomputers, storage systems, data sources, special facilities, etc.,
as a unified resource has led to the concept of "Computational Grids".
This is analogous to the electrical power grid, which provides power to consumers
through simple electrical connections, irrespective of where the power generators
are located or what type they are. Computational grid technology thus attempts
to provide users transparent access to the entire set of resources connected.
Thus the computational grid needs to cope with aspects such as authentication,
name space, resource management, scheduling, accounting, etc. At the same
time, a number of problems in modern distributed computing are being addressed
under the area broadly referred to as peer-to-peer computing. Under this heading,
a wide range of technologies is being developed to increase the utilisation of
information, bandwidth and computing resources in the Internet. This lecture
investigates the types of problems posed by computational grids and the extent
to which they overlap with or differ from the concerns of peer-to-peer computing.
ICANN: A Bird's Eye View, by M. Stuart Lynn,
ICANN President and CEO.
Thursday March 7, 2002, 13:00-14:30, Bender Auditorium
The Internet Corporation for Assigned Names and Numbers (ICANN) is a small,
not-for-profit company which was formed to assume responsibility for the IP
address space allocation, protocol parameter assignment, domain name system
management, and root server system management functions previously performed
under U.S. Government contract by IANA and other entities.
Stuart Lynn will talk about his experience as President and CEO
of an organization formed by a bottom-up process with global participation.
Eiffel: state of the art and current developments,
by Prof. Bertrand Meyer, ETH (Zürich) and ISE (Santa Barbara).
Monday 4 February 2002, 13:30-15:00, Milton Bender Auditorium, ITServ and
Eiffel is a method and language for object-oriented software development,
supported by numerous libraries and several development environments such
as ISE Eiffel. This presentation will describe the state of applications of
Eiffel in industry and academia, present the concepts of Eiffel, in particular
the method of "Design by Contract", which is central to the approach,
demonstrate the development environment, and describe recent developments in Eiffel.
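For readers unfamiliar with Design by Contract, the following sketch approximates Eiffel's `require`/`ensure` clauses with plain Python assertions; it is an analogy with an invented example routine, not Eiffel syntax or any ISE library:

```python
# Design by Contract, approximated in Python: preconditions state what the
# caller guarantees, postconditions state what the routine guarantees back.
# In Eiffel these are first-class `require` and `ensure` clauses.

def withdraw(balance, amount):
    # precondition (Eiffel: require) -- the caller's obligation
    assert amount > 0 and amount <= balance, "precondition violated"
    new_balance = balance - amount
    # postcondition (Eiffel: ensure) -- the routine's obligation
    assert new_balance >= 0, "postcondition violated"
    return new_balance

print(withdraw(100, 30))  # satisfies the contract
```

In Eiffel, contract violations are monitored by the runtime and pinpoint whether the fault lies with the caller (broken precondition) or the routine (broken postcondition), which is the key debugging benefit of the method.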
Bertrand Meyer is Professor of Software Engineering at ETH Zürich (the
Swiss Federal Institute of Technology) and scientific advisor and co-founder
of ISE, the originator of Eiffel. He is the author of several well-known books
on software engineering and object technology, in particular "Object-Oriented
Software Construction, 2nd edition" (1998 Jolt Award), "Eiffel:
The Language", "Object Success" and "Introduction to the
Theory of Programming Languages".
An Invitation to Computational Topology and Curve Reconstruction,
by Dr. Sumanta Guha, Associate Professor, Electrical Engineering &
Computer Science Dept., University of Wisconsin-Milwaukee, Milwaukee, USA.
Wednesday 16 January 2002, 9:00, CSIM #209
This is a two-part talk. In the first part I will present results of work
done in the emerging field of computational topology. We analyze the structure
of complexes (objects that admit a triangulation) in three-dimensional space R^3 and
give algorithms to determine certain properties of these objects that remain
invariant under "homotopic" transformation. A homotopy is essentially
a "plastic" transformation. For example, the letter "a" is homotopic
to the letter "o": one can be "squeezed" into the other.
The topological property that their shapes share is having one "tunnel"
through them. Similarly, "B" is homotopic to "8". We give
algorithms to compute the number of tunnels through and voids inside a 3D-complex
- quantities, called Betti numbers, that remain unchanged by plastic transformation.
The talk, including many visual examples, is intended to convey to a general
CS audience the flavor of an interesting new discipline that studies shape.
The second part of the talk is of a more applied nature and is based on ongoing
work. The problem we deal with is the following: given a sample of points
from some unknown curve, attempt to reconstruct the original curve. This problem,
as well as its twin problem of surface reconstruction (reconstruct an unknown
surface from a sample of points), has many practical applications - in domains
such as medical imaging and terrain analysis where sample data is obtained
from a digitized scanner. Both the curve and surface reconstruction problems
have been of great recent interest in the computational geometry community,
and a set of algorithms has been developed, all based on Delaunay triangulation.
We propose a new paradigm to reconstruct curves based on bounding curvature
and detecting monotone pieces. Our method is computationally more efficient
than Delaunay-based methods and preliminary implementations indicate comparable
output. The paradigm potentially extends to surface reconstruction. In the
talk I will try to show the main underlying ideas and implementation methodology.
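To illustrate the problem statement (and emphatically not the curvature-based paradigm proposed in the talk), a naive reconstruction simply chains each sample point to its nearest unvisited neighbour; this only works on dense, well-spaced samples, which is precisely the limitation the more robust methods address:

```python
import math

# Naive curve reconstruction: starting from a known endpoint, repeatedly
# walk to the nearest sample not yet visited. A toy baseline only; it fails
# on sparse or noisy samples, unlike Delaunay- or curvature-based methods.

def greedy_chain(points):
    """Order `points` into a polyline, starting from points[0]."""
    chain = [points[0]]
    remaining = list(points[1:])
    while remaining:
        last = chain[-1]
        nxt = min(remaining, key=lambda p: math.dist(last, p))
        remaining.remove(nxt)
        chain.append(nxt)
    return chain

# Dense samples of the parabola y = x^2, handed over in scrambled order
# (the true left endpoint is kept first so the chain knows where to start).
pts = [(x / 10, (x / 10) ** 2) for x in range(11)]
scrambled = [pts[0]] + pts[1::2] + pts[2::2]
assert greedy_chain(scrambled) == pts   # original order recovered
```

The surface-reconstruction twin of the problem has no such cheap heuristic, which is one reason the Delaunay-based family of algorithms became the standard approach.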