Experimentation Research Papers - Academia.edu
Working across earth and social sciences, this article reevaluates resilience’s conceptual framework, drawing out alternative pathways for understanding and responding to the dislocations of the Anthropocene. Via a critical reading of the Anthropocene with the help of resilience’s adaptive cycle heuristic, I locate the possibility of new forms of life in its phase of release and reorganisation: the back loop. More than a brief, negative phase to govern or navigate, I argue that the back loop offers the possibility for a practical orientation to the Anthropocene based on experimentation with new uses, release of old frameworks, and allowance for the unknown. Inhabiting the back loop, as I call it, articulates an ethos couched not in fear or survival but rather creative and technical audacity in unsafe operating space, as embodied already in a variegated landscape of practitioners.
- Geography, Political Theory, Resilience, Ecology
The fate of black carbon deposited on the soil surface after the burning of harvesting residues is poorly understood. In this study we quantified horizontal as well as vertical transport of total pyrogenic carbon and its chemically most stable fraction (OREC) during a rainfall experiment after an experimental burn. The aim of the study was to assess the importance of splash erosion versus runoff for the export of total pyrogenic carbon and OREC from soil. Our methodological approach compared 1 m² plots exposed to splash erosion with 1 m² plots where splash erosion was prevented by mosquito nets. On these plots, we quantified total pyrogenic carbon and OREC in the soil before and after the rainfall simulation. Additionally, we assessed the amount of total pyrogenic C and OREC exported from the plots and transported vertically into the soil. Our results showed that 7-55% of pyrogenic carbon was subject to horizontal transport and another 23-46% of the initially deposited pyrogenic C to vertical transport. The export of total pyrogenic carbon and OREC at least doubled on plots where splash was allowed. OREC was less prone to vertical transport than total pyrogenic C. We explain this contrast between the two carbon types by the favourable floating behaviour of OREC compared with the higher sedimentation rates of other pyrogenic C types, which may partly lead to redistribution of the latter. The presence or absence of splash erosion was found to determine the magnitude of horizontal versus vertical transport and thereby the fate of black C in soil.
This study reports on the evaluation of four meeting techniques for decision-making by small groups. Two face-to-face and two computer conferencing techniques were evaluated by small groups solving business cases. Particular groups varied widely in their satisfaction with different meeting techniques, but there was no significant difference in the quality of the group decisions.
Efficient processing of top-k queries is a crucial requirement in many interactive environments that involve massive amounts of data. In particular, efficient top-k processing in domains such as the Web, multimedia search, and distributed systems has shown a great impact on performance. In this survey, we describe and classify top-k processing techniques in relational databases. We discuss different design dimensions in the current techniques including query models, data access methods, implementation levels, data and query certainty, and supported scoring functions. We show the implications of each dimension on the design of the underlying techniques. We also discuss top-k queries in the XML domain, and show their connections to relational approaches.
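A classic technique in this family is the threshold algorithm, which scans sorted per-attribute lists in lock-step and stops as soon as no unseen item can beat the current top-k. The sketch below is a minimal illustration for a summed, monotone scoring function, assuming every item appears in every list; names and data layout are our own, not the survey's.

```python
import heapq

def threshold_algorithm(lists, k):
    """Threshold algorithm for a summed, monotone score.
    lists: {attribute: [(item, score), ...] sorted by score descending};
    assumes every item appears in every list."""
    index = {a: dict(l) for a, l in lists.items()}   # random access by item
    seen, topk = set(), []                           # topk: min-heap of (score, item)
    for depth in range(max(len(l) for l in lists.values())):
        threshold = 0
        for a, l in lists.items():
            item, score = l[depth]
            threshold += score                       # best score any unseen item could have
            if item not in seen:
                seen.add(item)
                total = sum(index[b][item] for b in lists)
                if len(topk) < k:
                    heapq.heappush(topk, (total, item))
                elif total > topk[0][0]:
                    heapq.heapreplace(topk, (total, item))
        if len(topk) == k and topk[0][0] >= threshold:
            break                                    # no unseen item can enter the top-k
    return sorted(topk, reverse=True)
```

The early-termination test against the threshold is what lets such techniques avoid scanning entire relations.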
Bespace is a synchronous web3D lecture hall and performance system that seeks to tap the educational and entertainment potential of virtual space. Its underlying interactive methodology focuses on the efficient delivery of information within the space. The entire virtual environment is treated as one interface. Principles of human-computer interaction are applied to the activities, avatars, and architecture. The outcome is an environment that harnesses the power of a digitally created reality without sacrificing usability in pursuit of realistic appearances. Bespace’s interactive methodology seeks to implement the promises made by the early pioneers of virtual space.
Although there is considerable archaeological and ethnographic evidence for the use of slate and similar rocks for flaking purposes, these raw materials have raised little interest among specialists, leading to a general ignorance of the specific problems associated with them. ...
In this paper, a combined use of dimensional analysis (DA) and modern statistical design of experiment (DOE) methodologies is proposed for a hydrodynamics experiment where there are a large number of variables. While DA is well-known, DOE is still unfamiliar to most ocean engineers although it has been shown to be useful in many engineering and non-engineering applications. To introduce ...
This article presents the experience of the Introductory Physics course in blended mode at PUCMM/RSTA. The pilot project was implemented in several phases, from the preparation of materials to the definition of the elements of the strategy and the orientation of students in the new modality. Results were evaluated with a variety of instruments: questionnaires, online surveys, interviews, and achievement tests, addressing aspects related to the student, the course, the teaching staff, the content, communication, and the virtual environment. The results were satisfactory in every area evaluated, including student performance.
Demand prepaging was long ago proposed as a method for taking advantage of high disk bandwidths and avoiding long disk latencies by fetching, at each page fault, not only the demanded page but also other pages predicted to be used soon. Studies performed more than twenty years ago found that demand prepaging would not be generally beneficial. Those studies failed to examine thoroughly the interaction between prepaging and main memory caching. It is unclear how many main memory page frames should be allocated to cache pages that were prepaged but have not yet been referenced. This issue is critical to the efficacy of any demand prepaging policy. In this paper, we examine prepaged allocation and its interaction with two other important demand prepaging parameters: the degree, which is the number of extra pages that may be fetched at each page fault, and the predictor that selects which pages to prepage. The choices for these two parameters, the reference behavior of the workload, and the main memory size all substantially affect the appropriate choice of prepaged allocation. In some situations, demand prepaging cannot provide benefit, as any allocation to prepaged pages will increase page faults, while in other situations, a good choice of allocation will yield a substantial reduction in page faults. We will present a mechanism that dynamically adapts the prepaged allocation on-line, as well as experimental results that show that this mechanism typically reduces page faults by 10 to 40% and sometimes by more than 50%. In those cases where demand prepaging should not be used, the mechanism correctly allocates no space for prepaged pages and thus does not increase the number of page faults. Finally, we will show that prepaging offers substantial benefits over the simpler solution of using larger pages, which can substantially increase page faults.
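The allocation question can be made concrete with a toy simulator. The sketch below is our own illustration, not the paper's adaptive mechanism: it partitions memory into an LRU pool for referenced pages and a FIFO pool of `prepage_alloc` frames for prepaged-but-unreferenced pages, and assumes a simple sequential predictor.

```python
from collections import OrderedDict

def simulate(trace, frames, degree, prepage_alloc):
    """Count page faults with demand prepaging (illustrative sketch).
    main: LRU pool for referenced pages (frames - prepage_alloc slots);
    pre:  FIFO pool for prepaged-but-unreferenced pages (prepage_alloc slots);
    predictor: the `degree` sequentially next pages (an assumption here).
    Requires prepage_alloc < frames."""
    main, pre = OrderedDict(), OrderedDict()
    main_cap = frames - prepage_alloc
    faults = 0
    for p in trace:
        if p in main:
            main.move_to_end(p)                # LRU hit
            continue
        if p in pre:
            del pre[p]                         # prepaged page referenced: no fault
        else:
            faults += 1
            if prepage_alloc:                  # fetch predicted successors too
                for q in range(p + 1, p + 1 + degree):
                    if q not in main and q not in pre:
                        if len(pre) >= prepage_alloc:
                            pre.popitem(last=False)
                        pre[q] = True
        if len(main) >= main_cap:
            main.popitem(last=False)           # evict least recently used page
        main[p] = True
    return faults
```

On a purely sequential trace this prepaging policy cuts faults by roughly the degree plus one, while on a workload the predictor cannot anticipate the same allocation would only waste frames — precisely the sensitivity the paper's adaptive mechanism targets.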
Within the scope of ANSWER, an EC-supported research project, a new artistic notation system, called DirectorNotation, is being developed for describing media content, much as notes are used to document a musical composition. Our current research focuses on developing “notation-enabled” offline authoring tools, but interactive applications are an anticipated extension. Here, we summarise the key aspects of DirectorNotation and the ANSWER project for a multimedia-technology audience interested in content-based manipulation of media.
The classification accuracy statement is the basis of the evaluation of a classification's fitness for purpose. Accuracy statements are also used for applications such as the evaluation of classifiers, with attention focused especially on differences in the accuracy with which data are classified. Many factors influence the value of a classification accuracy assessment and evaluation programme. This paper focuses on the size of the testing set(s), and its impacts on accuracy assessment and comparison. Testing set size is important as an inappropriately large or small sample could lead to limited and sometimes erroneous assessments of accuracy and of differences in accuracy. In this paper the basic statistical principles of sample size determination are outlined. Some of the basic issues of sample size determination for accuracy assessment and accuracy comparison are discussed. With the latter, the researcher should specify the effect size (minimum meaningful difference), significance level and power used in an analysis and ideally also fit confidence limits to estimates. This will help design a study as well as aid interpretation. In particular, it will help avoid problems such as under-powered analyses and provide a richer information base for classification evaluation. Central to the argument is a discussion of Type II errors and their control. The paper includes equations that could be used to determine sample sizes for common applications in remote sensing, using both independent and related samples.
High-pressure supported freezing processes need a more efficient refrigeration technique to be applicable at industrial level. A cooling method consisting of circulating a boiling refrigerant around the product in the vessel has been tested on a lab-scale prototype built for that purpose. The cooling kinetics of a mixture of ethanol, ethylene glycol and water (a usual pressurizing medium) were followed by recording temperatures throughout the sample. A mathematical model has been developed to describe heat transfer during cooling of the sample in the vessel. The heat transfer coefficient between the refrigerant and the vessel was determined by fitting the numerical simulation results to the experimental measurements. This model can be used to predict the cooling kinetics under other conditions (other products, larger vessels) and to optimise the process.
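Under a lumped-capacitance assumption, cooling follows T(t) = T_ref + (T0 − T_ref)·exp(−hA·t/(m·c)), so a heat transfer coefficient can be recovered by a log-linear least-squares fit. The sketch below is a generic illustration of such a fitting step, not the paper's model (which resolves heat transfer inside the vessel numerically); all parameter names are ours.

```python
import math

def fit_h(times, temps, t_ref, area, mass, cp):
    """Estimate a heat transfer coefficient h [W/(m^2 K)] by log-linear
    least squares on ln((T - T_ref)/(T0 - T_ref)) = -(h*A/(m*c)) * t,
    assuming a lumped (spatially uniform) sample temperature."""
    t0 = temps[0]
    ys = [math.log((T - t_ref) / (t0 - t_ref)) for T in temps]
    n = len(times)
    sx, sy = sum(times), sum(ys)
    sxx = sum(x * x for x in times)
    sxy = sum(x * y for x, y in zip(times, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # = -hA/(mc)
    return -slope * mass * cp / area
```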
The aim of this proposal is to present an original approach to the study and preservation of rock art caves. A multidisciplinary study of cave wall alteration will be performed to understand the impact of environmental context on the evolution of wall surfaces. The approach involves the choice of a cave with characteristics similar to painted caves in the studied area (Vézère Valley in Dordogne, France): e.g., cave wall alteration, lithology, morphology, etc. This selected cave is intended to become a laboratory cave, monitored for the acquisition of chemical, physical and biological environmental data on bedrock, air and fluids, along with their characteristics. A cave without art or archaeological interest has been chosen specifically so that experiments can be conducted, in situ analyses performed and samples taken. The results are to be complemented by specific observations in several other caves and compiled in a database.
- by Valme Jurado and +2
- Archaeology, Geochemistry, Methodology, Archaeological Science
In the context of an aging population, it is important today to take into account the needs of our seniors to help them live their later years well. The design of color and lighting ambiance contributes to the practice of their daily activities in their living environment. This study proposes a protocol of good practices in color and lighting for designing a visual environment adapted to the needs of these users. The protocol is based on a combination of chromatic and lighting expertise: chromatic color matching based on Natural Color System tools is combined with a photometric survey to characterize the visual environment. These data make it possible to establish a protocol used to design new chromatic ranges applied to new environments, or to evaluate the applicability of existing ranges.
Deterministic Algorithms for 3-D Diameter and some 2-D Lower Envelopes, Edgar A. Ramos. Abstract: We present a deterministic algorithm for computing the diameter of a set of n points in R³; its running time O(n log n) is worst-case optimal. ...
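For context, the baseline that the O(n log n) result improves on is the trivial quadratic scan over all point pairs (a sketch for comparison only, not the paper's algorithm):

```python
from itertools import combinations
from math import dist

def diameter(points):
    """Largest pairwise distance in a 3-D point set, by brute force in
    O(n^2); the paper's deterministic algorithm achieves O(n log n)."""
    return max(dist(p, q) for p, q in combinations(points, 2))
```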
In recent years, the concept of the smart city has taken hold in urban discourse and practice, becoming a label increasingly coveted by metropolises around the world. So-called smart urbanism rests on the premise that, through the application of digital solutions, cities can become more efficient, predictable and sustainable territories. But what sociotechnical scenarios and consequences does the spread of sensors and intelligent algorithms across the city raise? In this article I explore some of the contemporary debates and analytical challenges that emerge around the notion of the smart city. Drawing on interpretive keys from Science and Technology Studies (STS), I discuss the experimental nature of the smart city notion and the effects that datafication processes have on urban spaces and individuals. I conclude by invoking Isabelle Stengers's conceptual figure of the idiot, offering a critical reflection on the vision of the city contained in the smart city paradigm.
This paper examines the potential integration of the Living Labs concepts of open and user-driven innovation with Future Internet experimentally driven research approaches, in order to accelerate the user-driven development of Future Internet enabled services towards Smart Cities. Two key issues underlie this integration: strengthening user involvement in experimental Internet research, and providing access to common resources such as testbed facilities and living lab resources. To explore the opportunities for such integration, three case studies from current FP7-ICT projects are discussed: SmartSantander, TEFIS and ELLIOT. A framework is proposed that facilitates the sharing of resources offered by existing Smart City platforms, testbeds and living lab facilities as a basis for partnership agreements implementing open innovation approaches for Smart Cities.
Building optimization involving multiple objectives is generally an extremely time-consuming process. The GAINN approach presented in this study first uses a simulation-based Artificial Neural Network (ANN) to characterize building behaviour, and then combines this ANN with a multiobjective Genetic Algorithm (NSGA-II) for optimization. The methodology has been used in the current study for the optimization of thermal comfort and energy consumption in a residential house. Results of ANN training and validation are first discussed. Two optimizations were then conducted taking variables from HVAC system settings, thermostat programming, and passive solar design. By integrating ANN into optimization the total simulation time was considerably reduced compared to classical optimization methodology. Results of the optimizations showed significant reduction in terms of energy consumption as well as improvement in thermal comfort. Finally, thanks to the multiobjective approach, dozens of potential designs were revealed, with a wide range of trade-offs between thermal comfort and energy consumption.
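The "dozens of potential designs with a wide range of trade-offs" are the non-dominated (Pareto-optimal) solutions a multiobjective GA such as NSGA-II returns. A minimal dominance filter for two minimised objectives — an illustration of the concept, not GAINN code, with made-up objective pairs:

```python
def pareto_front(designs):
    """Keep the non-dominated designs when both objectives are minimised,
    e.g. (energy consumption, thermal discomfort) pairs."""
    return [d for d in designs
            if not any(o != d and o[0] <= d[0] and o[1] <= d[1]
                       for o in designs)]
```

Every design on the returned front trades one objective against the other; no design off the front is worth keeping, since some other design is at least as good in both objectives.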
Recent years have witnessed significant efforts to develop computer-based technologies for making maps accessible to people who are blind. Existing work has largely focused on the technological aspects of the problem without adequate attention to human-computer interaction issues. Using an audio-tactile system as the platform, we present a focused study on such HCI issues for supporting a blind user's effective navigation of a map in getting directions. The ultimate goal of the research is to establish comprehensive design guidelines for building technologies that truly serve the needs of the users in the application of accessible maps. The results of our current study suggest that the proposed designs are effective for supporting a blind user in obtaining directions from online maps.
There is a growing interest in empirical study in software engineering, both for validating mature technologies and for guiding improvements of less-mature technologies. This paper introduces an empirical methodology, based on experiences garnered over more than two decades of work by the Empirical Software Engineering Group at the University of Maryland and related organizations, for taking a newly proposed improvement to development processes from the conceptual phase through transfer to industry. The methodology presents a series of questions that should be addressed, as well as the types of studies that best address those questions. The methodology is illustrated by a specific research program on inspection processes for Object-Oriented designs. Specific examples of the studies that were performed and how the methodology impacted the development of the inspection process are also described.
In this paper, we present the results from two choice experiments that were employed to measure the benefits from changed levels of biodiversity due to nature-oriented silviculture in Lower Saxony, Germany. We also discuss different variants of calculating welfare measures for forest management strategies. The variants differ, among other things, with respect to whether the alternative specific constant (ASC), indicating the status quo option, is taken into account. In our study, including the ASC results in overall negative welfare measures, while excluding it yields positive measures. However, both variants might be inappropriate because of an underestimation or an overestimation of the benefits. Avoiding such bias would require differentiating between respondents who demand compensation for a move away from the status quo and respondents who would not suffer a loss but chose the status quo alternative because of choice task complexity, for instance.
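The ASC's effect on the sign of the welfare measure can be seen in the standard compensating-surplus formula CS = −(V₁ − V₀)/β_cost, where the ASC enters V₀ for the status quo. The coefficients below are invented for illustration, not the study's estimates:

```python
def compensating_surplus(betas, levels, cost_coef, asc=None):
    """CS = -(V1 - V0) / cost_coef for a move away from the status quo.
    V1: non-cost utility of the new alternative; V0: the ASC if it is
    included, else 0. All coefficient values are illustrative."""
    v1 = sum(b * x for b, x in zip(betas, levels))
    v0 = asc if asc is not None else 0.0
    return -(v1 - v0) / cost_coef
```

With a positive status quo ASC larger than the attribute gain, including it flips the sign of the welfare measure from positive to negative, which mirrors the pattern reported in the abstract.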
Is it possible to reduce the expected response time of every request at a web server, simply by changing the order in which we schedule the requests? That is the question we ask in this paper. We propose a method for improving the performance of web servers servicing static HTTP requests. The idea is to give preference to requests for ...
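The intuition behind size-based preference can be shown with a batch of requests served one at a time: putting short requests first lowers the mean completion time, and here no request is starved since every request still completes. This is a toy calculation, not the paper's queueing analysis:

```python
def mean_completion(sizes, shortest_first):
    """Mean completion time for a batch of requests served sequentially.
    Favouring short requests (the essence of size-based scheduling for
    static files) lowers the mean."""
    order = sorted(sizes) if shortest_first else list(sizes)
    t = total = 0
    for s in order:
        t += s                 # this request finishes at time t
        total += t
    return total / len(order)
```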
The question of when and how international orders change remains a pertinent issue of International Relations theory. This article develops the model of pragmatic ordering to conceptualise change. The model of pragmatic ordering synthesises recent theoretical arguments for a focus on ordering advanced in practice theory, pragmatist philosophy, and related approaches. It also integrates evidence from recent global governance research. We propose a five-stage model. According to the model, once a new problem emerges (problematisation), informality allows for experimenting with new practices and developing new knowledge (informalisation and experimentation). Once these experimental practices become codified, and survive contestation, they increasingly settle (codification) and are spread through learning and translation processes (consolidation). We draw on the rise of the maritime security agenda as a paradigmatic case and examine developments in the Western Indian Ocean region to illustrate each of these stages. The article draws attention to the substantial reorganisation of maritime space occurring over the past decade and offers an innovative approach for the study of orders and change.
Rapid product lifecycles and high development costs pressure manufacturing firms to cut not only their development times (time-to-market), but also the time to reach full capacity utilization (time-to-volume). The period between completion of development and full capacity utilization is known as production ramp-up. During that time, the new production process is ill understood, which causes low yields and low production rates. This paper analyzes the interactions among capacity utilization, yields, and process improvement (learning). We model learning in the form of deliberate experiments, which reduce capacity in the short run. This creates a trade-off between experiments and production. High selling prices during ramp-up raise the opportunity cost of experiments, yet early learning is more valuable than later learning. We formalize the resulting intertemporal trade-off between the short-term opportunity cost of capacity and the long-term value of learning as a dynamic program. The paper also examines the trade-off between production speed and yield/quality, where faster production rates lead to more defects. Finally, we show what happens if managers misunderstand the sources of learning.
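The intertemporal trade-off can be sketched as a tiny dynamic program in which, each period, the firm either produces at full capacity or runs an experiment that halves this period's output but moves it up the yield curve. The structure and all numbers below are illustrative, not the paper's model:

```python
from functools import lru_cache

def ramp_up_value(periods, price, capacity, yields):
    """Maximum revenue over a ramp-up horizon. yields[i] is the process
    yield after i experiments; an experiment costs half of one period's
    output. Illustrative sketch of the experiment-vs-production DP."""
    @lru_cache(maxsize=None)
    def v(t, i):
        if t == periods:
            return 0.0
        y = yields[min(i, len(yields) - 1)]
        produce = price * capacity * y + v(t + 1, i)
        experiment = 0.5 * price * capacity * y + v(t + 1, i + 1)
        return max(produce, experiment)
    return v(0, 0)
```

With a steep yield curve the optimum front-loads an experiment despite the forgone output, matching the observation that early learning is more valuable than later learning.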
- by Pega Hrnjak and +1
- Engineering, Refrigeration, Heat Transfer, Performance
In this paper, we report some reflections on science and education, in relation to teaching and research in the field of complex and controversial socio-environmental issues. Starting from an examination of the literature on the epistemological aspects of the science of controversial issues, and introducing the perspective of complexity, the article argues for a complexity of content, context, and method in understanding current problems. Focusing on a model of learning which includes dialogical and reflective approaches, the final part of the article reports on aspects of the authors' experimental practice with role-play for dealing with complex issues. The review of the literature and our experience of action-research introduce a view of education which promotes young people's awareness of multiple points of view, an ability to establish relationships between processes, scales, and contexts which may be nonlinearly related, and practice with creative and nonviolent forms of interrelation with others. Such an approach in science education is coherent with a scenario of planet sustainability based on ecological webs and equity principles.
Participatory spatial planning and decision making requires a combination of software tools for group decision support, individual decision support, and geographic analysis and presentation. This article presents such an approach, integrating two software tools that were originally developed independently. One tool, Dito, is a Java application for the World Wide Web designed to facilitate structured argumentation and discourses. The other tool, CommonGIS, provides Java-based web-enabled services for the interactive, explorative generation and analysis of thematic maps, and it also supports multi-criteria decision making. The evolution of the integrated system is reviewed from the first experiments in 2001, through the resulting requirements and a succession of prototypes, up to the latest solution. The focus of the article lies in the design of this solution.
In petabyte-scale distributed file systems that decouple read and write from metadata operations, behavior of the metadata server cluster will be critical to overall system performance and scalability. We present a dynamic subtree partitioning and adaptive metadata management system designed to efficiently manage hierarchical metadata workloads that evolve over time. We examine the relative merits of our approach in the context of traditional workload partitioning strategies, and demonstrate the performance, scalability and adaptability advantages in a simulation environment.
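The core routing idea of subtree partitioning — directing each metadata operation to the server responsible for the deepest enclosing subtree — can be sketched as a longest-prefix match over path names. This is a static illustration only; the system described here migrates subtrees between servers dynamically as the workload evolves:

```python
def assign_server(path, subtree_map, default=0):
    """Route a metadata operation to the server owning the deepest
    enclosing subtree (longest matching path prefix).
    subtree_map: {directory prefix: server id} (illustrative)."""
    best, server = "", default
    for prefix, s in subtree_map.items():
        if path == prefix or path.startswith(prefix.rstrip("/") + "/"):
            if len(prefix) > len(best):
                best, server = prefix, s
    return server
```

Compared with hashing individual paths, prefix-based assignment preserves locality within a directory hierarchy, which is what makes rebalancing whole subtrees at once possible.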
Objectives. The learning style preferences of public health students are investigated. Learning styles, as defined by Kolb (3), refer to four distinct manners of processing information. Methods. Students' learning styles are analyzed for associations by gender, occupation, and public health program. The value of learning styles to predict students' preference for oral presentation versus written exams, performance on different types of exams, and course grade are studied. Results. Learning styles of the students sampled were neither predominantly of one learning style nor evenly distributed. Learning style preferences did not vary significantly according to gender, occupation, or public health program. Learning styles also did not predict the students' choice of oral presentation or written exam. Assimilators, however, scored significantly higher than the other three learning styles on the theoretical exam and in their course grade. Discussion. The authors suggest that public health instructors employ a variety of teaching methods and evaluative opportunities when class composition is initially assessed as having diverse learning styles. This 'instructional pluralism' is necessary to facilitate learning, maximize participation, and permit multiple pathways for students to demonstrate educational performance.
Systematic reviews are generally regarded by professionals in the field of evidence-based medicine as the highest level of medical evidence, and they are already standard practice for clinical studies. However, they are neither widely used nor undertaken in the field of animal experimentation, even though there is much to be gained from the process. Therefore, a gold standard publication checklist (GSPC) for animal studies is presented in this paper. The items on the checklist have been selected on the basis of a literature analysis and the resulting scientific evidence that these factors are decisive in determining the outcome of animal studies. The checklist is intended to make future systematic reviews and meta-analyses of animal studies possible, to allow others to replicate and build on work previously published, to diminish the number of animals needed in animal experimentation (reduction), to improve animal welfare (refinement) and, above all, to improve the quality of scientific papers on animal experimentation.
Despite several years of innovative research, indoor localization is still not mainstream. Existing techniques either employ cumbersome fingerprinting or rely upon the deployment of additional infrastructure. Towards a solution that is easier to adopt, we propose CUPID, which is free from these restrictions yet comparable in accuracy. While existing WiFi-based solutions are highly susceptible to indoor multipath, CUPID utilizes physical layer (PHY) information to extract the signal strength and angle of only the direct path, successfully avoiding the effect of multipath reflections. Our main observation is that natural human mobility, when combined with PHY layer information, can help in accurately estimating the angle and distance of a mobile device from a wireless access point (AP). Real-world indoor experiments using off-the-shelf wireless chipsets confirm the feasibility of CUPID. In addition, while previous approaches rely on multiple APs, CUPID is able to localize a device when only a single AP is present. When a few more APs are available, CUPID can improve the median localization error to 2.7m, which is comparable to schemes that rely on expensive fingerprinting or additional infrastructure.
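The claim that angle and distance to a single AP suffice for localization can be illustrated with a toy estimator. The log-distance path-loss model below, its default reference power and exponent, and the `locate` helper are all illustrative assumptions made for this sketch; CUPID's actual estimator derives direct-path strength and angle from PHY-layer information rather than from a simple propagation model:

```python
import math

def distance_from_rss(rss_dbm, rss_ref_dbm=-40.0, ref_dist_m=1.0,
                      path_loss_exp=3.0):
    """Estimate distance from direct-path signal strength via the
    log-distance path-loss model:
        RSS(d) = RSS(d0) - 10 * n * log10(d / d0)
    Reference power and exponent are illustrative, uncalibrated defaults."""
    return ref_dist_m * 10 ** ((rss_ref_dbm - rss_dbm) / (10 * path_loss_exp))

def locate(ap_xy, bearing_rad, dist_m):
    """With a single AP, a bearing (angle of the direct path) plus a
    distance pins down the device position."""
    return (ap_xy[0] + dist_m * math.cos(bearing_rad),
            ap_xy[1] + dist_m * math.sin(bearing_rad))
```

For example, a direct-path reading of -70 dBm under these defaults maps to roughly 10 m from the AP, and combining that range with the estimated bearing yields a single-AP position fix, which is the geometry the abstract relies on.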
This contribution presents multi-agent technology for the air-traffic control of several autonomous aerial vehicles (manned as well as unmanned). The system has been validated mainly on collision avoidance (CA) tasks. Several cooperative and non-cooperative CA methods have been integrated into the system to validate and compare their properties in scalable experiments. The AGENTFLY system is also suitable for developing and testing algorithms for collective flight control. The operation of the underlying multi-agent system has been integrated with freely available geographical and tactical data sources. The system provides real-time 2D/3D visualization and a web-access component.
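A minimal flavour of a non-cooperative CA check is the closest-point-of-approach test below: an avoidance manoeuvre would be triggered when the predicted miss distance falls under a separation threshold. The function and its straight-line, constant-velocity assumption are illustrative simplifications, not AGENTFLY's actual algorithms:

```python
import math

def closest_point_of_approach(p1, v1, p2, v2):
    """Time and miss distance of closest approach for two aircraft
    flying straight at constant velocity in 2D. Returns (t, miss)
    with t clamped to the future (t >= 0)."""
    dp = (p2[0] - p1[0], p2[1] - p1[1])   # relative position
    dv = (v2[0] - v1[0], v2[1] - v1[1])   # relative velocity
    dv2 = dv[0] ** 2 + dv[1] ** 2
    if dv2 == 0:
        t = 0.0  # identical velocities: separation never changes
    else:
        # minimise |dp + dv*t|; derivative zero at t = -(dp . dv) / |dv|^2
        t = max(0.0, -(dp[0] * dv[0] + dp[1] * dv[1]) / dv2)
    miss = math.hypot(dp[0] + dv[0] * t, dp[1] + dv[1] * t)
    return t, miss
```

Two aircraft closing head-on at 10 units apart with opposing unit velocities reach a zero miss distance after 5 time units, so any sensible separation threshold would flag the pair for an avoidance manoeuvre.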
The complexity of network management is widely recognized as one of the biggest challenges facing the Internet today. Point solutions for individual problems further increase system complexity while not addressing the underlying causes. In this paper, we argue that many network-management problems stem from the same root cause: the need to maintain consistency between the physical and logical configuration of the routers. Hence, we propose VROOM (Virtual ROuters On the Move), a new network-management primitive that avoids unnecessary changes to the logical topology by allowing (virtual) routers to freely move from one physical node to another. In addition to simplifying existing network-management tasks like planned maintenance and service deployment, VROOM can also help tackle emerging challenges such as reducing energy consumption. We present the design, implementation, and evaluation of novel migration techniques for virtual routers with either hardware or software data planes. Our evaluation shows that VROOM is transparent to routing protocols and results in no performance impact on the data traffic when a hardware-based data plane is used.
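The migration primitive can be sketched schematically: control-plane state is copied to the destination first and the data plane is switched over last, so forwarding continues throughout and the logical topology never changes. All class and method names below are hypothetical, not VROOM's implementation:

```python
# Schematic sketch of live virtual-router migration between physical
# nodes. The separation into a control plane (RIB) and a data plane
# (FIB) mirrors the hardware/software split the abstract refers to.

class PhysicalNode:
    def __init__(self, name):
        self.name = name
        self.routers = {}  # virtual router name -> instance

class VirtualRouter:
    def __init__(self, name, rib):
        self.name = name
        self.rib = dict(rib)  # control-plane state (routing table)
        self.fib = dict(rib)  # data-plane state (forwarding table)

def migrate(router, src, dst):
    """Move a virtual router between physical nodes without changing
    the logical topology seen by routing protocols."""
    # 1. Instantiate a clone on the destination and copy control state.
    clone = VirtualRouter(router.name, router.rib)
    dst.routers[router.name] = clone
    # 2. Repopulate the destination data plane from the copied RIB.
    clone.fib = dict(clone.rib)
    # 3. Switch traffic over, then retire the source instance.
    del src.routers[router.name]
    return clone
```

Because neighbours only ever see the same logical router, the move is invisible to routing protocols, which is the transparency property the evaluation in the abstract reports.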