Béat Hirsbrunner - Academia.edu
Papers by Béat Hirsbrunner
This paper presents an implementation of the "form template method" refactoring. This transformation has not yet been automated, but it shares many similarities with other transformations such as clone detection and removal, or method extraction. Forming a template method is a difficult process because it has to deal with code statements directly. Few abstractions and algorithms have been investigated so far, compared with transformations dealing with higher-level aspects such as classes, methods, fields and their relations. We present a new algorithm that performs this transformation in a semi-automated way on Java programs. We state the difficulties inherent to this transformation and propose solutions to handle them.
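To illustrate what the refactoring targets, the sketch below shows the shape of code after a "form template method" step: two sibling classes that once duplicated the same algorithm now share a skeleton in their superclass and override only the varying step. It is a minimal, hedged example; the class and method names are hypothetical and do not come from the paper.

```java
// After "form template method": the shared algorithm skeleton lives in the
// superclass, and only the varying step is deferred to the subclasses.
abstract class Report {
    // The template method fixes the overall sequence of steps.
    public final String render(String data) {
        return header() + body(data) + footer();
    }

    protected String header() { return "=== Report ===\n"; }
    protected String footer() { return "\n=== End ===\n"; }

    // The step that differed between the original sibling classes.
    protected abstract String body(String data);
}

class CsvReport extends Report {
    @Override
    protected String body(String data) {
        return data.replace(' ', ',');
    }
}

class PlainReport extends Report {
    @Override
    protected String body(String data) {
        return data;
    }
}
```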
The evolution of mobile communication systems to 3G and beyond introduces requirements for flexible, customized, and ubiquitous multimedia service provision to mobile users. One must be able to know at any given time the network status, the user location, the profiles of the various entities involved (users, terminals, network equipment, services) and the policies that are employed within the system. In other words, the system must be able to cope with a large amount of context information. The present paper focuses on location and context awareness in mobile service provisioning and proposes a flexible and innovative model for user profiling. The innovation lies in enriching common user profiling architectures with location and other contextual attributes, so that enhanced adaptability and personalization can be achieved. For each location and context instance an associated User Profile instance is created, and service provisioning is therefore adapted to the User Profile instance that best applies to the current context. The generic model, the structure and the content of this location- and context-sensitive User Profile, along with some related implementation issues, are discussed.
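A minimal sketch of the core idea, keeping one User Profile instance per location/context instance and resolving the best match at provisioning time, is given below. The class and field names are hypothetical illustrations of the described enrichment, not the paper's actual model.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical context key: a location plus a coarse context label.
record ContextKey(String location, String situation) {}

// Hypothetical profile content adapted per context.
record UserProfile(String preferredCodec, int maxBitrateKbps) {}

class ContextSensitiveProfiles {
    private final Map<ContextKey, UserProfile> profiles = new HashMap<>();
    private final UserProfile defaultProfile = new UserProfile("h264", 500);

    // Store a profile instance for a given location/context instance.
    void put(ContextKey key, UserProfile profile) {
        profiles.put(key, profile);
    }

    // Service provisioning picks the profile matching the current context,
    // falling back to a default profile when no specific instance exists.
    UserProfile resolve(ContextKey current) {
        return profiles.getOrDefault(current, defaultProfile);
    }
}
```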
Computing in Cardiology Conference, Sep 1, 2013
Aims: This study focuses on fibroblast cells, which under physiological conditions provide structural support for the heart, but under pathophysiological conditions can obstruct the pacemaker activity and the excitation-spread function of the heart and may thereby give rise to arrhythmia. Methods: We investigated the influence of high fibroblast levels under several patho-physiological conditions. The simulation was performed on a 3D heart model, adopting a 0.25 mm spatial and 2 μs temporal maximal resolution. In our simulation, fibroblasts occupy 25% of the volume of normal cardiac tissue, and in the presence of pathological conditions or aging, fibroblast cells accumulate up to 40%-95% of the volume. We included the effects of cardiomyocyte death and laminar sheets. Results: With 30%/35%/40%/45%/50% fibroblast content in the cardiac myocyte tissue, the spread velocity of depolarization was reduced by 3%/7%/11%/15%/19%, while regional inflammation and injuries locally reduced the propagation speed of excitation by at least 20%. Tissue aging reduced cardiac pacemaker activity and increased the possibility of irregular cardiac activity. Conclusion: High fibroblast levels not only significantly slow the spread of excitation, but can also obstruct the depolarization wave and provoke cardiac arrhythmia.
Peer-to-peer (P2P) networks and overlays have gained wide acceptance in recent years, attributed mainly to their inherent support for handling concerns such as scalability, dynamicity and heterogeneity that are commonly found in current networking infrastructures. In this respect, peer-to-peer overlay approaches serve as networking abstraction layers, thus scaling down the degree of complexity and leading to more manageable networks that respect high-level application criteria, e.g. minimal delay or optimal energy consumption. Nonetheless, the additional level of virtualization that peer-to-peer networks impose exacerbates energy consumption, mainly because of the additional message exchanges and the computation required to operate the corresponding protocols. The aim of this chapter is to review existing peer-to-peer networks, overlay protocols and approaches in the context of their energy requirements. In addition to traditional approaches, we also examine recent research work on peer-to-peer solutions that have energy efficiency as their main design goal. After presenting an overview of energy concerns and considerations, we present a taxonomy of these peer-to-peer systems with the goal of establishing a base of reference, as well as highlighting the relation between the different peer-to-peer systems' characteristics and features and their energy performance.
Future Generation Computer Systems
Job scheduling strategies have been studied for decades in a variety of scenarios. Owing to the new characteristics of emerging computational systems such as the grid and the cloud, metascheduling has become an important scheduling pattern because it is responsible for orchestrating resources managed by independent local schedulers and bridges the gap between participating nodes. Equally, to overcome issues such as bottlenecks, single points of failure, and impractical single administrative management, which conventional centralized or hierarchical schemes normally entail, decentralized scheduling is emerging as a promising approach owing to its scalability and flexibility.
Simulation Modelling Practice and Theory, 2011
For decades, much work has been done to increase the effectiveness and efficiency of job sharing amongst available computational resources. Resources can be organized into a variety of topologies, and recent work has shown that a decentralized distributed resource topology is a crucial but complicated scenario. This is because decentralized resources are normally grouped into independent virtual organizations (VOs) and isolated from each other by VO boundaries.
Information Integration and Web-based Applications & Services, 2003
As a typical data aggregation technique in wireless sensor networks, the spanning tree has the ability to reduce data redundancy and therefore decrease energy consumption. However, tree construction normally ignores other practical application requirements, such as network lifetime, convergence time and communication interference. Designing a tree structure subject to multiple objectives therefore becomes a crucial task, known as the multi-objective Steiner tree problem (MOSTP). To address this situation, a multi-objective optimization framework is proposed, and a heuristic algorithm based on jump particle swarm optimization (JPSO) with a specific double-layer encoding scheme is introduced to discover Pareto-optimal solutions. Furthermore, the simulation results validate the feasibility and high efficiency of the novel approach by comparison with other approaches.
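As a hedged illustration of the Pareto-optimality notion that such a multi-objective framework relies on, the sketch below checks dominance between candidate solutions evaluated on several objectives (e.g. energy, lifetime, interference) and keeps the non-dominated set. Representing each candidate tree by a plain objective vector is an assumption for illustration, not the paper's double-layer encoding.

```java
import java.util.ArrayList;
import java.util.List;

class ParetoFront {
    // A candidate tree is summarized by its objective vector,
    // where every objective is assumed to be minimized.
    static boolean dominates(double[] a, double[] b) {
        boolean strictlyBetter = false;
        for (int i = 0; i < a.length; i++) {
            if (a[i] > b[i]) return false;        // worse in one objective: no dominance
            if (a[i] < b[i]) strictlyBetter = true;
        }
        return strictlyBetter;
    }

    // Keep only non-dominated objective vectors (the Pareto front).
    static List<double[]> front(List<double[]> candidates) {
        List<double[]> result = new ArrayList<>();
        for (double[] c : candidates) {
            boolean dominated = false;
            for (double[] other : candidates) {
                if (other != c && dominates(other, c)) { dominated = true; break; }
            }
            if (!dominated) result.add(c);
        }
        return result;
    }
}
```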
Computing in Cardiology Conference, Sep 1, 2013
Aims: This study aims to present the simulation of several types of cardiac arrhythmias using an adaptively selected spatio-temporal resolution, including an accuracy analysis of the experiment. Methods: We developed a spatio-temporally adaptive whole-heart simulation algorithm that automatically and regionally selects the proper resolution. The upper and lower resolution limits (0.2-5 mm; 1-500 μs) are established a priori, while the granularity at a given moment depends on the type, place and state of each modeled compartment. We included the effects of muscle fiber direction, laminar sheets, average and minimal depolarization period, and cell inhomogeneity. Results: The estimation errors were maximal during the fast depolarization phase of the action potential. Under normal circumstances, tachycardia and bradycardia, a 2/5/10/20 times lower spatial resolution induced an estimation error of about 3%/7%/12%/20% in the shape of the depolarizing front line. In the presence of AF/VF, under similar simulation conditions, the estimation error rose to 7%/15%/22%/34%. Conclusion: The simulation of cardiac arrhythmia demands a high spatio-temporal resolution during the fast depolarization phase and in the presence of AF/VF, due to the irregular spread of depolarization.
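A toy sketch of the kind of rule an adaptive-resolution scheme might apply is shown below: refine where the membrane potential changes fast (the depolarization front) and coarsen in quiescent tissue, within the a priori limits stated above. The specific thresholds and the rule itself are hypothetical illustrations, not the study's algorithm.

```java
class AdaptiveResolution {
    // A priori limits from the study: 0.2-5 mm spatial, 1-500 µs temporal.
    static final double MIN_DX_MM = 0.2, MAX_DX_MM = 5.0;
    static final double MIN_DT_US = 1.0, MAX_DT_US = 500.0;

    // Hypothetical rule: pick {dx in mm, dt in µs} from the local rate of
    // change of the membrane potential (mV per ms).
    static double[] pick(double dVdtPerMs) {
        if (dVdtPerMs > 10.0) return new double[]{MIN_DX_MM, MIN_DT_US}; // depolarization front: finest grid
        if (dVdtPerMs > 1.0)  return new double[]{1.0, 50.0};            // repolarizing region: intermediate
        return new double[]{MAX_DX_MM, MAX_DT_US};                       // resting tissue: coarsest grid
    }
}
```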
Lecture Notes in Computer Science, 2013
This study focuses on the effects of artificial cardiac tissue on the excitation-contraction process of the ventricular muscle. We developed a spatio-temporal computerized model of the whole heart that handles half-millimeter-sized compartments using a 1 microsecond time step. We included the effects of muscle fiber direction, laminar sheets, depolarization period and other parameters. The artificial tissue differs from normal tissue in several ways, so the parameters describing it are modified accordingly. In our simulation, the depolarization wave (DW) conduction speed of the artificial tissue was decreased by up to a factor of three. In the presence of a two-centimeter-wide and 2 mm thick artificial tissue slice, the maximal depolarization delay was 38 ms. Large ventricle size, low conduction speed and the extent of the injured ventricular tissue are the main factors generating arrhythmia, while the location of the artificial tissue is of secondary importance.
International Journal of Parallel, Emergent and Distributed Systems, Apr 1, 2011
IET Communications, May 25, 2017
Wireless sensor networks consist of a large number of distributed sensor devices, which are connected and coordinated through multi-hop routing. Owing to correlated information and redundancy in the measured data, data messages can be combined and merged by performing a data aggregation function during routing. Reducing energy consumption is a major optimisation objective of data aggregation approaches, and it can be achieved by decreasing the compulsory communication load of routing. At the theoretical level, an energy model is proposed to validate the benefits of data aggregation with respect to energy consumption. The key parameters which may impact aggregation performance are further discussed. Finally, the corresponding simulations are implemented to verify the conclusions drawn from the theoretical model, and the comparison results effectively reflect the advantages of data aggregation in different scenarios.
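The hedged sketch below illustrates the kind of comparison such an energy model enables: counting per-link transmissions along a routing tree with and without in-network aggregation. The tree representation and the assumption that an aggregating node merges its children's reports into a single message are illustrative, not the paper's model.

```java
import java.util.List;
import java.util.Map;

class AggregationLoad {
    // children.get(n) lists the child nodes of n in the routing tree; node 0 is the sink.
    // Without aggregation, every source report is relayed hop by hop to the sink.
    static int messagesWithoutAggregation(Map<Integer, List<Integer>> children, int node) {
        int total = 0;
        for (int child : children.getOrDefault(node, List.of())) {
            // transmissions inside the child's subtree plus one per report over the child->node link
            total += messagesWithoutAggregation(children, child) + subtreeSources(children, child);
        }
        return total;
    }

    // With aggregation, each node forwards one merged message per link (assumption).
    static int messagesWithAggregation(Map<Integer, List<Integer>> children, int node) {
        int total = 0;
        for (int child : children.getOrDefault(node, List.of())) {
            total += messagesWithAggregation(children, child) + 1;
        }
        return total;
    }

    // Each node is assumed to produce one measurement.
    static int subtreeSources(Map<Integer, List<Integer>> children, int node) {
        int count = 1;
        for (int child : children.getOrDefault(node, List.of())) {
            count += subtreeSources(children, child);
        }
        return count;
    }
}
```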
In this paper we present a new communication and dynamic migration technique based on a postal delivery model, which provides transparent mail management and routing facilities for the implementation of dynamic load-balancing strategies. The protocols are defined using concepts such as a finite state machine and a distributed address book. This postal delivery model is used for the design and development of Cola, a new coordination language.
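A hedged sketch of a finite state machine of the kind such protocols could be specified with is given below; the state and event names are hypothetical and are not taken from the Cola design.

```java
import java.util.Map;

class MailFsm {
    // Hypothetical message states in a postal-style delivery protocol.
    enum State { POSTED, IN_TRANSIT, AT_ADDRESS_BOOK, DELIVERED, RETURNED }
    enum Event { FORWARD, LOOKUP_OK, LOOKUP_FAILED, ACCEPTED }

    // Transition table: current state + event -> next state.
    private static final Map<State, Map<Event, State>> TRANSITIONS = Map.of(
        State.POSTED, Map.of(Event.FORWARD, State.IN_TRANSIT),
        State.IN_TRANSIT, Map.of(Event.LOOKUP_OK, State.AT_ADDRESS_BOOK,
                                 Event.LOOKUP_FAILED, State.RETURNED),
        State.AT_ADDRESS_BOOK, Map.of(Event.ACCEPTED, State.DELIVERED)
    );

    private State current = State.POSTED;

    // Apply an event and move to the next state, rejecting undefined transitions.
    State fire(Event event) {
        State next = TRANSITIONS.getOrDefault(current, Map.of()).get(event);
        if (next == null) {
            throw new IllegalStateException("No transition for " + current + " on " + event);
        }
        current = next;
        return current;
    }
}
```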
The advent of pervasive environments and the future Internet has brought heterogeneity, large scale and dynamicity to current networking infrastructures, which nonetheless also promote flexibility, availability, adaptability and support for mobility. With the increasing diversity of users' networking requirements, solutions based on network virtualization techniques such as P2P overlays emerge as promising approaches to creating network infrastructures with desirable characteristics. Successful deployment of such P2P overlays, however, necessitates efficient management mechanisms. Taking into account that static solutions are not geared towards handling the dynamics of the considered environments, or the diversity of users' and applications' requirements, adaptive solutions become prominent. We present here the design of an autonomic framework that concurrently manages multiple P2P overlays built on top of pervasive environments, utilising context information and high-level policies to guide their reconfiguration. Relevant organizational requirements and design issues are also highlighted.
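As a hedged illustration of policy-driven overlay reconfiguration, the sketch below evaluates simple high-level rules against context observations and returns the triggered reconfiguration actions. The rule conditions and action names are hypothetical and are not part of the framework described above.

```java
import java.util.List;
import java.util.function.Predicate;

class OverlayPolicyEngine {
    // Hypothetical context snapshot for one managed overlay.
    record Context(double avgDelayMs, double batteryLevel, int peerCount) {}

    // A policy maps a context condition to a named reconfiguration action.
    record Policy(String name, Predicate<Context> condition, String action) {}

    private final List<Policy> policies = List.of(
        new Policy("latency", c -> c.avgDelayMs() > 200, "shorten-routing-paths"),
        new Policy("energy", c -> c.batteryLevel() < 0.2, "reduce-maintenance-traffic"),
        new Policy("churn", c -> c.peerCount() < 5, "merge-with-neighbour-overlay")
    );

    // Return the actions triggered by the current context of an overlay.
    List<String> evaluate(Context context) {
        return policies.stream()
                .filter(p -> p.condition().test(context))
                .map(Policy::action)
                .toList();
    }
}
```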