Metamodelling Research Papers - Academia.edu
Simulations are commonly used to assess building performance with respect to energy demand and indoor environment. However, the use of performance simulations is limited during the early design stages, which are characterized by large uncertainties. This industrial Ph.D. study presents a novel simulation approach that relies on thousands of simulations representing the multidimensional design space. Interactive visualizations enable decision-makers to explore the vast design space in real time and identify favorable solutions that satisfy the needs of different stakeholders. Sensitivity analysis helps reveal the important design parameters that require the most attention when seeking to improve building performance. Fast metamodels provide immediate feedback on design changes and reduce the time spent on performance assessment. Ultimately, the work described in this thesis and on buildingdesign.moe.dk facilitates proactive guidance and supports collaboration between building owners, architects, engineers, and contractors, helping the design team create buildings with high performance at minimum cost.
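To make the metamodel idea concrete, the sketch below fits a fast surrogate to a batch of simulation results and then answers design queries instantly. The `energy_demand` function, the two design parameters and the quadratic response-surface form are hypothetical stand-ins, not the simulation engine or metamodels used in the thesis.

```python
import numpy as np

# Hypothetical "expensive" simulation: energy demand as a function of
# window-to-wall ratio and insulation thickness (stand-in for a real tool).
def energy_demand(wwr, insulation):
    return 120 - 40 * insulation + 25 * wwr**2 + 5 * wwr * insulation

# Sample the design space (a real study might use thousands of Latin
# hypercube samples of many more parameters).
rng = np.random.default_rng(0)
wwr = rng.uniform(0.1, 0.9, 500)
ins = rng.uniform(0.05, 0.40, 500)
y = energy_demand(wwr, ins)

# Fit a quadratic response-surface metamodel by least squares.
X = np.column_stack([np.ones_like(wwr), wwr, ins, wwr**2, ins**2, wwr * ins])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Instant feedback for a new design, no simulation run required.
def predict(wwr_new, ins_new):
    x = np.array([1.0, wwr_new, ins_new, wwr_new**2, ins_new**2, wwr_new * ins_new])
    return x @ coef

print(predict(0.4, 0.25))  # approximate energy demand for an unseen design
```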
Scientific and naturalistic assumptions dominate the field of communication theory, making theoretical engagement with spiritual modes of communication difficult. Current models of spiritual communication fail to elaborate on how, exactly, researchers from different traditions can engage with this new tradition. This study uses Craig's (1999) constitutive metamodel to create a model of the Spiritual Tradition of communication theory, which views communication as mimetic of the atemporal. Situating the Spiritual Tradition within the metamodel will allow researchers to understand better the dialogical/dialectical tension created by different understandings of communication.
In metamodern culture, handicraft is everywhere. As I argue, the ‘artisanal turn’ is not just a symptom of postmodern nostalgia, i.e. past ‘options’ or ‘instances’ allowed to make a second appearance. Rather, it is our very experience of time that has changed. What seemed old can appear authentically new again. Today's interest in crafts and craftsmanship thus has less to do with the idolisation of pre-industrial handicrafts by John Ruskin or the anti-industrial Arts and Crafts movement founded by William Morris than with Bauhaus. Ever since, craft has been emancipating itself from the intimacy of the studio and the corresponding closed guild mind that values only the specifics of its metier and its skills. This transformation marks less the disappearance of craftsmanship after the end of art than its development into a general media literacy. Given a certain material, what is it capable of? It was perhaps in this metamodern sense that Deleuze and Guattari, in A Thousand Plateaus, proposed the concept of the modern artist as “cosmic artisan”. I offer a mannerist genealogy for metamodern crafts and craftsmanship. Starting from tensions brought about in matter-form relationships by contemporary digital design practices, I retrospectively problematise the division of labor between design and craft at the very moment it first appeared. In this way I expose an informal or cosmic dimension in both mannerist and metamodern craftsmanship, characterized by an infinite and continuous variation of manners rather than forms. I will then develop some of the ontological, epistemological and political implications of this dimension in the light of recent developments in Theory such as New Materialism, Object-Oriented Ontology (OOO), and the care for the plastic relationality of the Self, which themselves are interpreted as expressions of a metamodern sensibility.
This work proposes a multi-scale methodology based on surrogate models, expressed as functions of random variables, to evaluate the residual reliability of concrete structures suffering from alkali-aggregate reaction (AAR), with a view to better maintenance. Surrogate models, based on a polynomial chaos expansion of the parameters of a shape function (a sigmoid in the studied cases), were constructed at several scales, primarily to reduce the computation time of the underlying physical models. At the microscopic scale, the AAR model employed is the one developed by Multon et al. (2009), initially comprising about twenty potential random variables. Following a Morris sensitivity analysis, the surrogate model reproduces the expansion curve over time of the representative elementary volume as a function of nine random variables. Using the resulting surrogate model to predict the mechanical effects of AAR expansion on a concrete core required accounting for the anisotropy of these effects by improving the weight functions proposed by Saouma and Perotti (2006). Once the core scale was validated by comparing predictions with the experimental data from Multon's thesis work, an application at the scale of the Song Loulou dam was undertaken. The thermo-chemo-mechanical behavior of a spillway block was computed, and the resulting displacements were compared with the monitoring data provided by the company AES-SONEL (now ENEO). Surrogate models were then constructed at the scale of the structure to obtain displacements at the points of interest, related to the operating limit states of the spillways, and thus to estimate the residual reliability of the dam. The sensitivity analyses and the construction of the surrogate models were implemented in Fortran, Java and OpenTURNS. Computations on the concrete cores and on the Song Loulou dam spillway were performed with Cast3M.
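A minimal sketch of the Morris screening step mentioned above is given here; the `toy_model`, parameter ranges and trajectory count are illustrative assumptions and have nothing to do with the actual AAR model of Multon et al.

```python
import numpy as np

def morris_screening(model, n_params, n_traj=20, delta=0.1, seed=0):
    """One-at-a-time (Morris) elementary-effects screening on the unit hypercube."""
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(n_traj):
        x = rng.uniform(0, 1 - delta, n_params)    # random base point
        for i in range(n_params):
            x_step = x.copy()
            x_step[i] += delta                     # perturb one factor at a time
            effects[i].append((model(x_step) - model(x)) / delta)
    mu_star = [np.mean(np.abs(e)) for e in effects]  # overall influence
    sigma   = [np.std(e) for e in effects]           # nonlinearity / interactions
    return mu_star, sigma

# Toy stand-in for the expensive expansion model (not the actual AAR model).
toy_model = lambda x: 2.0 * x[0] + 0.5 * x[1] ** 2 + 0.1 * x[2] * x[3]

mu_star, sigma = morris_screening(toy_model, n_params=6)
print(mu_star)  # large values flag parameters worth keeping as random variables
```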
In this heavily annotated article the provocative thesis is submitted that there is something fundamentally wrong with Western Sinology, or 'Chinakunde', or 'Zhongguoxue' (as distinct from 'Hanxue', which is a kind of old-fashioned philology). 'China experts' either pretend to be knowledgeable about everything related to China, in which case they cannot be taken seriously, or - eventually - admit not to be scientific all-rounders with respect to the country, in which case they cannot be called 'China experts'. The author, who graduated in Sinology from Leyden University and in economics from Erasmus University Rotterdam, not only believes that the study of China has to be taken to a higher level (a belief he expects few tenured professors of Chinese Studies/History will share); he also explains how this long overdue task can be accomplished. Sinologists should take the complexity turn. They should treat China as a 'Ganzheit' (not: 'Gesamtheit'), as a territory-bound, history-moulded and culture-soaked totality of identifiable yet interdependent (f)actors, as a whole intimately interconnected with its numerous parts, as a hypercomplex system of complex, adaptive and non-linear systems of political, military, legal, economic, financial, social, medical, educational, artistic or other nature. Firmly distancing itself from multidisciplinary research (which in practice is a matter of juxta- rather than composition), the new study of China requires a well-thought-out, balanced division of labour. Close collaboration with ICT-driven, China-oriented experts in the natural, social and human sciences willing to co-operate with each other is a sine qua non for comprehending the country that seems to be moving to the centre stage of world politics. The study of China should be a mile wide and a mile deep. The heyday of Sinology is yet to come!
An earlier version of this highly critical but undeniably constructive paper was rejected out of hand by the editors of leading 'Chinese/Asian Studies' journals. The author claims to have reason to suspect them (and other so-called China experts) of being 'bought by China'!
Since the late 1980s, there has been a growing interest in the use of foundational ontologies to provide a sound theoretical basis for the discipline of conceptual modeling. This has led to the development of ontology-based conceptual modeling techniques whose modeling primitives reflect the conceptual categories defined in a foundational ontology. The ontology-based conceptual modeling language OntoUML, for example, incorporates the distinctions underlying the taxonomy of types in the Unified Foundational Ontology (UFO) (e.g., kinds, phases, roles, mixins, etc.). So far, this approach has focused on support for types whose instances are individuals in the subject domain, with no provision for types of types (or categories of categories). In this paper we address this limitation by extending the Unified Foundational Ontology with the MLT multi-level theory. The UFO-MLT combination serves as a foundation for conceptual models that can benefit from the ontological distinctions of UFO as well as MLT's basic concepts and patterns for multi-level modeling. We discuss the impact of the extended foundation on multi-level conceptual modeling.
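The notion of "types of types" at the heart of UFO-MLT can be loosely illustrated with Python metaclasses; this is only an analogy for the multi-level pattern (a second-order type whose instances are first-order types), not an encoding of UFO or MLT.

```python
# "TreeSpecies" is a type whose instances are themselves types (classes):
# a second-order type in multi-level-modeling terms.
class TreeSpecies(type):
    def __new__(mcls, name, bases, ns, conservation_status=None):
        return super().__new__(mcls, name, bases, ns)

    def __init__(cls, name, bases, ns, conservation_status=None):
        super().__init__(name, bases, ns)
        cls.conservation_status = conservation_status


class Tree:
    """First-order base type: its instances are individual trees."""
    def __init__(self, height_m):
        self.height_m = height_m


# SugarMaple is simultaneously an instance of TreeSpecies and a subtype of Tree.
SugarMaple = TreeSpecies("SugarMaple", (Tree,), {}, conservation_status="least concern")

old_maple = SugarMaple(height_m=23.5)          # an individual in the subject domain
print(isinstance(SugarMaple, TreeSpecies))     # True  (type of types)
print(isinstance(old_maple, Tree))             # True  (ordinary individual)
print(SugarMaple.conservation_status)          # attribute characterising the species
```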
In today's global and complex business environment, security is a major issue for any organization. All organizations should have the capability to plan for and respond to incidents and business disruptions. Business continuity management (BCM) is part of information security management, and the BCM process can meet these needs. Indeed, business continuity refers to the ability of a business to continue its operations even if some sort of failure or disaster occurs. BCM requires a holistic approach that considers both technological and organizational aspects. Enterprise architecture (EA), in turn, is a comprehensive view of an organization's architecture, covering business and technology architecture and their relationships, and several studies consider EA a foundation for business continuity and security management. Our research aims at studying how BCM aspects can be embedded into the enterprise architecture. In this sense, this paper proposes a metamodel and an implementation method that consider business continuity in the design and implementation of EA.
By considering scientific paradigm shifts, in this chapter the authors examine possible parallels in the evolution of modelling, and particularly of metamodelling and modelling language construction, as a basis for evaluating whether or not the time is ripe for a similar change of direction in modelling language development for software engineering. Having identified several inconsistencies and paradoxes in the current orthodoxy, they then introduce a number of ideas from outside software engineering (including language use, philosophy, and ontology engineering) that seem to resolve many of these issues. Whether these new ideas, together, are sufficient to create a shift in mindset, or whether they are simply the stimulus for others to create new and orthogonal ideas, remains to be seen. The authors urge the modelling and metamodelling communities to seek out that new orthodoxy (i.e. to instigate a paradigm shift) which will, necessarily, ensure that the science offers simpler and more satisfying solutions in the years to come.
In this paper the thesis is submitted that the study of China (indeed of each and every country) is wrongheaded in its methodology and should be fundamentally redesigned. Readers are kindly requested to note that the author distinguishes between two kinds of 'China experts': 1) those (tacitly) claiming to be knowledgeable about everything related to China, a - thank God - dying but by no means extinct breed of scholars mainly active in Europe; and 2) those not CLAIMING to be China experts (that is, those eventually admitting not to be scientific all-rounders with respect to the country) but - and herein lies the crux of the matter - not shying away from entering, without announcement or permission, a domain outside their own field of expertise, SHOULD THEY HAVE ONE! Many American and other 'China experts' can be blamed for this. 'China expert' is a misnomer, grossly (and dangerously) misleading politicians, captains of industry and the general public, as the author argues at length in the paper, which sums up a great deal of his research over the last few years. Scientists (e.g. demographers, economists, sociologists, psychologists, political scientists, jurists, linguists, literary theorists/critics and 'Kunst- und Kulturwissenschaftler') may have a research interest in a particular aspect of China related to their own discipline; they can never be justifiably called 'China experts'. Throughout the article, particularly in the text under the subheading Scientific Collaboration, the necessity (and possibility!) of academic teamwork is emphasised, that is to say, the need for interdisciplinary (as distinct from multidisciplinary) research on China. The political, legal, military, economic, financial, social, educational, ecological, religious and other kindred affairs of China are inextricably intertwined, forming together a complex system of complex systems. Consequently, the country should be studied (as it has never been done!) by specialists working closely together and aware of the historical as well as the geographical dimension of their 'material objects' (explananda). Briefly, the paper pleads for gaining EMPIRICALLY AND THEORETICALLY BASED knowledge of CHINA AS SUCH.
The Internet of Things (IoT) is a large-scale, complex, networked cyber-physical system in which Smart Objects (SOs) will be the fundamental building blocks. Although many research efforts in the IoT realm have to date been devoted to device, networking, and application-service perspectives, software engineering approaches for the development of IoT systems are still in their infancy. This paper introduces a novel software engineering approach aiming to support the systematic development of SO-based systems. The proposed approach is based on metamodels defined at different levels of abstraction to support the development phases of analysis, design and implementation. The effectiveness of the proposed approach is demonstrated through a simple yet effective case study showing the development of a smart office SO from high-level design to its agent-based implementation.
One of the advantages of following an MDA-based approach in the development of interactive applications is the possibility of generating multiple platform-specific user interfaces (UIs) from the same platform-independent UI model. However, the effort required to create the UI model may be significant. In the case of data-intensive applications, a large part of the UI structure and functionality is closely related to the structure and functionality of the domain entities described in the domain model and to the access rules specified in the use case model. This paper presents an approach to reduce the effort required to create platform-independent UI models for data-intensive applications by automatically generating an initial UI model from the domain and use case models. For that purpose, UML-aligned metamodels for domain and use case models are defined, together with a MOF-based metamodel for user interface models. The transformation rules that drive the UI model generation are introduced. An MDA-based process for the development of data-intensive interactive applications, based on the proposed model architecture and transformations, is also proposed.
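A minimal sketch of the kind of model-to-model transformation described here, deriving an initial UI model from a domain model, might look as follows; the metamodel classes and the single transformation rule are simplified stand-ins for the UML/MOF-based metamodels and rules defined in the paper.

```python
from dataclasses import dataclass, field
from typing import List

# --- tiny domain metamodel ----------------------------------------------
@dataclass
class Attribute:
    name: str
    type: str

@dataclass
class Entity:
    name: str
    attributes: List[Attribute] = field(default_factory=list)

# --- tiny platform-independent UI metamodel -----------------------------
@dataclass
class Widget:
    label: str
    kind: str            # e.g. "text_field", "date_picker"

@dataclass
class Form:
    title: str
    widgets: List[Widget] = field(default_factory=list)

# --- transformation rule: every entity yields an edit form --------------
WIDGET_FOR_TYPE = {"String": "text_field", "Date": "date_picker", "Integer": "number_field"}

def entity_to_form(entity: Entity) -> Form:
    widgets = [Widget(label=a.name, kind=WIDGET_FOR_TYPE.get(a.type, "text_field"))
               for a in entity.attributes]
    return Form(title=f"Edit {entity.name}", widgets=widgets)

customer = Entity("Customer", [Attribute("name", "String"), Attribute("birthDate", "Date")])
print(entity_to_form(customer))   # initial UI model, to be refined by the designer
```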
This study aims to find the best cross-sectional shapes of thin-walled columns subjected to oblique impact loading in terms of crashworthiness. To approximate the shape, spline polynomials are used with four key points, exploiting the double symmetry of the cross-section. Crashworthiness is defined using a multi-objective function. The design space is sampled using a Latin hypercube design-of-experiments methodology. Based on the finite element analyses, the objective functions are approximated with a radial basis function network. The corresponding Pareto front is found with the Non-dominated Sorting Genetic Algorithm II (NSGA-II). It is found that plus-sign-like cross-sections perform better than the benchmarks for all objectives.
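The surrogate-assisted multi-objective step can be sketched as below, assuming SciPy's RBFInterpolator for the radial basis surrogates; the two objective functions are analytical stand-ins for the finite element crash analyses, and a brute-force non-dominated filter stands in for NSGA-II.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator  # assumed available (SciPy >= 1.7)

rng = np.random.default_rng(1)

# Hypothetical stand-ins for the FE analyses: two crash objectives evaluated at
# sampled key-point coordinates describing the cross-section shape.
def peak_force(x):  return 1.0 + np.sum((x - 0.3) ** 2, axis=-1)
def mean_force(x):  return 2.0 - np.sum((x - 0.7) ** 2, axis=-1)

X = rng.uniform(0.0, 1.0, size=(80, 4))          # 4 key-point design variables
f1, f2 = peak_force(X), -mean_force(X)           # both objectives to be minimised

# RBF surrogates replace the expensive finite-element runs.
s1 = RBFInterpolator(X, f1)
s2 = RBFInterpolator(X, f2)

# Cheap predictions on a large candidate set, then keep the non-dominated designs.
cand = rng.uniform(0.0, 1.0, size=(5000, 4))
F = np.column_stack([s1(cand), s2(cand)])
pareto = [i for i, fi in enumerate(F)
          if not np.any(np.all(F <= fi, axis=1) & np.any(F < fi, axis=1))]
print(len(pareto), "approximately Pareto-optimal candidate designs")
```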
In philosophy, the term ontology has been used since the 17th century to refer both to a philosophical discipline (Ontology with a capital "O") and to a domain-independent system of categories that can be used in the conceptualization of domain-specific scientific theories. In the past decades there has been a growing interest in the subject of ontology in computer and information sciences. In the last few years, this interest has expanded considerably in the context of the Semantic Web and MDA (Model-Driven Architecture) research efforts, due to the role ontologies are perceived to play in these initiatives. In this paper, we explore the relations between Ontology and ontologies in the philosophical sense and domain ontologies in computer science. Moreover, we elaborate on formal characterizations of the notions of ontology, conceptualization and metamodel, as well as on the relations between these notions. Additionally, we discuss a set of criteria that a modeling language should meet in order to be considered suitable for modeling phenomena in a given domain, and present a systematic framework for language evaluation and design. Furthermore, we argue for the importance of ontology in both of the aforementioned philosophical senses for designing and evaluating a suitable general ontology representation language, and we address the question of whether the so-called Ontology Web languages can be considered suitable general ontology representation languages. Finally, we motivate the need for two complementary classes of modeling languages in Ontology Engineering addressing two separate sets of concerns.
Disaster Management (DM) is a diffuse area of knowledge. It has many complex features interconnecting the physical and social views of the world. Many international and national bodies create knowledge models to enable knowledge sharing and effective DM activities, but these are often narrow in focus and deal with specific disaster types. We analyze thirty such models and find that many DM activities are in fact common even when the events vary. We then create a unified view of DM in the form of a metamodel, applying a metamodelling process to ensure that this metamodel is complete and consistent. We validate it and present a representational layer to unify and share knowledge as well as to combine and match different DM activities according to different disaster situations.
Hospitals are critical infrastructures that are vulnerable to natural disasters such as earthquakes, as well as to man-made disasters and mass casualty events. During an emergency, a hospital might itself incur structural and non-structural damage and have limited communication and resources, so it might not be able to treat the large number of incoming patients. For this reason, the majority of medium- and large-size hospitals have an emergency plan that quickly expands their services beyond normal operating conditions to meet an increased demand for medical care, but it is impossible for them to test it before an emergency occurs. The objective of this article is to develop a simplified model that describes the ability of a hospital Emergency Department to provide service to all patients after a natural disaster or any other emergency. The waiting time is the main response parameter used to measure hospital resilience to disasters. The analytical model has been built using the following steps. First, a discrete event simulation model of the Emergency Department of a hospital located in Italy is developed, taking into account the hospital resources, the emergency rooms, the circulation patterns and the patient codes. The results of the Monte Carlo simulations show that, when the emergency plan is applied, the waiting times for yellow codes are reduced by 96% and those for green codes by 75%. Then, using the results obtained from the simulations, a general metamodel is developed that provides the waiting times of patients as a function of the seismic input and the number of available emergency rooms. The proposed metamodel is general and can be applied to any type of hospital.
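The final metamodel idea, waiting time as a function of seismic input and available emergency rooms, can be sketched with a simple response surface; the data points and the chosen functional form below are invented for illustration and are not the authors' simulation results or metamodel.

```python
import numpy as np

# Hypothetical results of discrete-event simulations: waiting time (minutes)
# for different seismic intensities (PGA, g) and numbers of open emergency rooms.
pga   = np.array([0.1, 0.1, 0.3, 0.3, 0.5, 0.5, 0.7, 0.7])
rooms = np.array([2,   6,   2,   6,   2,   6,   2,   6  ])
wait  = np.array([35., 12., 80., 30., 160., 65., 260., 110.])

# Fit a simple response-surface metamodel: wait = b0 + b1*pga + b2*rooms + b3*pga/rooms.
X = np.column_stack([np.ones_like(pga), pga, rooms, pga / rooms])
b, *_ = np.linalg.lstsq(X, wait, rcond=None)

def waiting_time(pga_new, rooms_new):
    return np.array([1.0, pga_new, rooms_new, pga_new / rooms_new]) @ b

# Immediate estimate for a scenario that was never simulated explicitly.
print(round(float(waiting_time(0.4, 4)), 1), "minutes")
```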
A number of new techniques that improve the efficiency of random sampling high dimensional model representation (RS-HDMR) are presented. Comparison shows that quasi-Monte Carlo based HDMR (QRS-HDMR) significantly outperforms RS-HDMR. RS/QRS-HDMR based methods also show faster convergence than the Sobol method for the calculation of sensitivity indices. Numerical tests prove that the developed methods for choosing optimal polynomial orders and numbers of sampled points are robust and efficient.
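For readers unfamiliar with sensitivity indices, the following sketch estimates first-order Sobol indices by plain Monte Carlo on the standard Ishigami benchmark; it illustrates what the indices measure, not the RS-HDMR/QRS-HDMR machinery developed in the paper.

```python
import numpy as np

def sobol_first_order(model, n_params, n=20000, seed=0):
    """Monte Carlo estimate of first-order Sobol indices (Saltelli-style pick-freeze)."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, (n, n_params))
    B = rng.uniform(-np.pi, np.pi, (n, n_params))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = []
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # vary only the i-th factor between the two matrices
        S.append(np.mean(fB * (model(ABi) - fA)) / var)
    return S

# Ishigami test function, a standard sensitivity-analysis benchmark.
def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

print(sobol_first_order(ishigami, n_params=3))   # roughly [0.31, 0.44, 0.0] analytically
```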
The purpose of a metamodel in web engineering methodologies is to support the platform-independent analysis and design of the content, navigation, and presentation aspects of web applications. In previous years, a number of methodologies for the development of web applications have been proposed, and most of them defined their own notation for creating metamodels. The increasing expansion and complexity of web applications pose a new challenge for web software developers. This paper presents a comparative study of the metamodels of three methodologies: UML-Based Web Engineering (UWE), Web Modeling Language (WebML) and Object Oriented Hypermedia (OOH). The aim is to show the capability of these methodologies to address the challenges in developing web applications. The evaluation results presented in this paper help designers by providing initial knowledge of the strengths and weaknesses of the three methodologies for developing web applications.
In several subject domains, the categorization scheme itself is part of the subject matter: experts make use of categories of categories in their accounts. This has led to a number of approaches in conceptual modeling and knowledge representation that are called multi-level modeling approaches. An early approach to multi-level modeling is the powertype pattern, which introduces "power types" and "base types". More recently, other proposals for multi-level modeling, including "clabjects" and "m-objects", admit the existence of entities that are somehow, simultaneously, types (classes) and instances (usually associated with objects). Regardless of the chosen approach to multi-level modelling, a question remains concerning the ontological status of "base types", "power types" and "clabjects". This paper aims to address this gap through an ontological analysis. We use the general term powertype to refer to types whose instances exhibit both type-like and instance-like characteristics. We examine alternative accounts of powertype instances: (i) powertype instances as universals (abstract repeatable entities), (ii) powertype instances as mereological sums of instances of an associated type, and (iii) powertype instances as variable embodiments. We conclude that the latter is the most promising account for an ontological interpretation of this phenomenon that meets the modelling desiderata for powertypes present in the literature.
This study makes two contributions to mathematics education: a study of teachers and technology, and a methodology for investigating technological interventions. The study included (1) a discussion of contextual problems and design principles, (2) the design of an online survey tool (i.e., the Student Response Form), (3) the implementation of the tool in a variety of mathematics classrooms, and (4) a description of teachers' perspectives on the impact of using the tool on their practice and their students. In general, teachers perceived the Student Response Form to amplify existing practices due to increased student feedback. In addition, teachers reported an increase in classroom interactions. This study explored the use of a non-conventional method of inquiry called method assemblage, significantly influenced by the philosophical work of Gilles Deleuze and Félix Guattari. The study is offered as a basic prototype for investigating innovative uses of technology. This method attempts to maintain the tensions between rigor and unexpected outcomes.
Code generation is an important part of Model-Driven Development, as is the abstract representation of software functionality. One of the strong deficiencies of this paradigm is the lack of research projects on code generation that use the terminology applied in industry. To improve this situation, this work presents a metamodel proposal for a web development code generation tool. The proposal is based on the terminology used in industry for web development in the region of Sinaloa, México. This is an extension and updated explanation of our work previously presented at a conference.
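The essence of template-based code generation from a metamodel can be sketched as follows; the tiny entity metamodel and the HTML form template are hypothetical and far simpler than the proposal described in the paper.

```python
from dataclasses import dataclass
from typing import List

# Minimal metamodel for a web entity; the real proposal is richer and uses
# region-specific industry terminology.
@dataclass
class Field:
    name: str
    html_input: str   # e.g. "text", "email", "date"

@dataclass
class WebEntity:
    name: str
    fields: List[Field]

FORM_TEMPLATE = """<form action="/{entity}/save" method="post">
{inputs}
  <button type="submit">Save</button>
</form>"""

def generate_form(entity: WebEntity) -> str:
    inputs = "\n".join(
        f'  <label>{f.name}<input type="{f.html_input}" name="{f.name}"></label>'
        for f in entity.fields
    )
    return FORM_TEMPLATE.format(entity=entity.name.lower(), inputs=inputs)

client = WebEntity("Client", [Field("name", "text"), Field("email", "email")])
print(generate_form(client))   # generated artefact a developer would then refine
```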
Here is a draft proposal for a crypto token concept. The short version is that the token's quote would be a real-time mirror of our capacity to ensure our growth and survival odds as a species. This capacity could become an objective asset built from underlying positive as well as negative indicators.
In the model-based development context, metamodel-based languages are increasingly being defined and adopted, either for general purposes or for specific domains of interest. However, the meta-languages used to specify metamodels, such as the MOF (Meta Object Facility) combined with the OCL (Object Constraint Language) for expressing constraints, focus on structural and static semantics and have no built-in support for specifying behavioral semantics. This paper introduces a formal semantic framework for defining the semantics of metamodel-based languages. Using metamodelling principles, we propose several techniques, some based on the translational approach and others on the weaving approach, all showing how the Abstract State Machine formal method can be integrated with current metamodel engineering environments to endow language metamodels with precise and executable semantics. We exemplify the use of our semantic framework by applying the proposed techniques to the OMG metamodelling framework for the behaviour specification of Finite State Machines provided in terms of a metamodel.
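The general idea of attaching executable behaviour to a language metamodel can be sketched with a toy Finite State Machine abstract syntax and an operational "run" function; this conveys only the flavour of the approach and does not reproduce the ASM formalism or the OMG metamodels used in the paper.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

# A toy FSM metamodel (abstract syntax)...
@dataclass
class StateMachine:
    initial: str
    transitions: Dict[Tuple[str, str], str] = field(default_factory=dict)  # (state, event) -> state

# ...and an operational, ASM-flavoured semantics: a run is a sequence of
# state-update steps fired by incoming events.
def run(machine: StateMachine, events):
    state = machine.initial
    for ev in events:
        state = machine.transitions.get((state, ev), state)  # unknown events leave the state unchanged
        yield state

turnstile = StateMachine(
    initial="locked",
    transitions={("locked", "coin"): "unlocked", ("unlocked", "push"): "locked"},
)
print(list(run(turnstile, ["coin", "push", "push"])))  # ['unlocked', 'locked', 'locked']
```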
Liquid flow and level control are essential requirements in various industries, such as paper manufacturing, petrochemical industries, and waste management. Controlling liquid flows and levels in such industries is challenging due to the nonlinearity and modeling uncertainties of the plants. This paper presents a method to control the liquid level in the second tank of a coupled-tank plant by manipulating the water pump feeding the first tank. The optimum controller parameters for this plant are calculated using a radial basis function neural network metamodel. A nonlinear dynamic model with time-varying parameters is developed, and the corresponding linearized perturbation models are derived from it. The performance of the controller optimized using the metamodel is compared with that of the original large design-space approach.
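A rough sketch of metamodel-based controller tuning is shown below: a crude coupled-tank simulation provides a cost for a few gain values, a polynomial metamodel of the cost is fitted, and the best gain is read off the metamodel. The plant equations, coefficients and gain range are assumptions for illustration, not the paper's plant model or its RBF neural network.

```python
import numpy as np

# Very simplified coupled-tank dynamics (illustrative only): pump flow into
# tank 1, gravity-driven flow from tank 1 into tank 2.
def simulate_ise(kp, setpoint=0.15, dt=0.5, t_end=600.0):
    h1 = h2 = 0.0
    a1 = a2 = 2e-4                                  # assumed outlet coefficients
    ise = 0.0
    for _ in np.arange(0.0, t_end, dt):
        u = np.clip(kp * (setpoint - h2), 0.0, 1e-4)   # pump command (m^3/s), saturated
        q12 = a1 * np.sqrt(max(h1, 0.0))
        qout = a2 * np.sqrt(max(h2, 0.0))
        h1 += dt * (u - q12) / 0.01                    # tank areas of 0.01 m^2
        h2 += dt * (q12 - qout) / 0.01
        ise += dt * (setpoint - h2) ** 2               # integral of squared error
    return ise

# Build a cheap metamodel of the cost over the gain, then pick the best gain from it.
gains = np.linspace(0.001, 0.05, 8)
costs = np.array([simulate_ise(k) for k in gains])
coef = np.polyfit(gains, costs, deg=3)                 # polynomial metamodel of ISE(kp)
fine = np.linspace(gains.min(), gains.max(), 500)
best = fine[np.argmin(np.polyval(coef, fine))]
print("metamodel-suggested gain:", round(float(best), 4))
```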
This chapter presents a surrogate-based approach for system reliability analysis of earth slopes with random soil properties, within the framework of the limit equilibrium method of slices. A support vector machine regression (SVR) model is employed as a surrogate to approximate the limit-state function based on Bishop's simplified method coupled with a nonlinear programming optimization technique. The value of the minimum factor of safety and the location of the critical slip surface are treated as the output quantities of interest. Finally, Monte Carlo simulation in combination with Latin hypercube sampling is performed via the SVR model to estimate the system failure probability of slopes. The detailed results indicate that the proposed SVR-based procedure is very promising in terms of accuracy and efficiency.
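A compact sketch of the SVR-surrogate-plus-Monte-Carlo workflow, using scikit-learn's SVR; the factor-of-safety function, the soil-property distributions and the plain random sampling (in place of Latin hypercube sampling) are illustrative assumptions, not a real slope stability model.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical stand-in for the limit-equilibrium analysis: factor of safety as a
# function of cohesion c (kPa) and friction angle phi (deg). Not a real slope model.
def factor_of_safety(c, phi):
    return 0.03 * c + 0.025 * phi + 0.3

# Small design of experiments on which the expensive analyses are "run".
c_train   = rng.uniform(5, 30, 60)
phi_train = rng.uniform(15, 35, 60)
fs_train  = factor_of_safety(c_train, phi_train)

surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.001)
surrogate.fit(np.column_stack([c_train, phi_train]), fs_train)

# Monte Carlo simulation on the cheap surrogate: random soil properties,
# failure whenever the predicted factor of safety drops below 1.
n = 200_000
c_mc   = rng.normal(15.0, 4.0, n)
phi_mc = rng.normal(25.0, 3.0, n)
fs_mc  = surrogate.predict(np.column_stack([c_mc, phi_mc]))
print("estimated failure probability:", np.mean(fs_mc < 1.0))
```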
A black-box model is a numerical simulation used in optimization. Because it is computationally expensive, it is convenient to replace it with surrogate models obtained by simulating only a few points and then approximating the original black box. Here, a recent approach using Symbolic Regression via Genetic Programming is compared experimentally to neural network based surrogate models, using test functions and electromagnetic models. The accuracy of the model obtained by Symbolic Regression proves to be good, and the interpretability of the resulting function is useful in reducing the optimization search space.
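A minimal sketch of such a comparison, assuming the third-party gplearn package for genetic-programming symbolic regression and scikit-learn for the neural-network surrogate; the hidden test function stands in for the electromagnetic models.

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor      # assumed installed: pip install gplearn
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, (300, 2))
y = X[:, 0] ** 2 + 0.5 * X[:, 1]                   # hidden "black box" to recover

sr = SymbolicRegressor(population_size=500, generations=15, random_state=0)
sr.fit(X, y)

nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
nn.fit(X, y)

X_test = rng.uniform(-2, 2, (200, 2))
y_test = X_test[:, 0] ** 2 + 0.5 * X_test[:, 1]
print("GP expression:", sr._program)               # human-readable surrogate
print("GP RMSE:", np.sqrt(np.mean((sr.predict(X_test) - y_test) ** 2)))
print("NN RMSE:", np.sqrt(np.mean((nn.predict(X_test) - y_test) ** 2)))
```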
This paper concerns the metamodel of the XTT rule-based knowledge representation. XTT is a knowledge representation and design method that aims at combining decision trees and decision tables. As the pure XTT method required dedicated tools for designing systems, a UML representation has been developed. The abstract syntax of UML is defined by the UML metamodel. This paper introduces a metamodel proposal for the developed UML representation.
Nowadays the number of software components that coexist within a motor vehicle can be counted in the hundreds; however, in the automotive industry the software development process is neither as mature nor as evolved as the vehicle manufacturing process, where there are well-defined methods and practices for reusing components at the design and architectural level. In this paper, the authors propose a metamodel, developed with software engineering techniques, aimed at offering an alternative that allows the expert engineers involved in the development of software components for automotive vehicles to reuse software components related to driver-assistance systems such as ABS, ASR, ESP and traction control.
The application of data mining techniques for obtaining knowledge has historically required the intervention of experts to obtain satisfactory results. This paper presents a solution proposal for the complex problem of identifying the requirements of non-expert users who want to apply data mining techniques. The modeling language for goal-oriented requirements analysis i* (i-star) has been used to facilitate the use of a taxonomy of requirements. As a result, it is intended that non-expert users can represent their requirements without having in-depth knowledge of data mining techniques. The application to a case study serves as a proof of concept to validate the proposed model.
Context modeling is often used to relate the context in which a system will operate to the entities of interest in the problem domain. However, context models remain inadequate in emerging computing paradigms (e.g., smart spaces and the Internet of Things), in which the relevance of context is shaped dynamically by the changing needs of users. Formal models are required to fuse and interpret contextual information obtained from heterogeneous sources. Here, we propose an integrated and formal context modeling approach for intelligent systems operating in context-sensitive environments. We introduce a goal-driven, entity-centered identification method for determining which context elements are influential in adapting the system behavior. We then describe a four-layered framework for metamodeling the identification and management of context. First, the framework presents a formal metamodel of context. A formalization of context using first-order logic with relational operators is then presented to formally specify context information at different abstraction levels. The metamodel thus prepares the ground for building a formal modeling language and automated support tool (https://github.com/metamodeler/CIM-CSS/). The proposed model is evaluated using an application scenario in the smart meeting rooms domain, and the results are analyzed qualitatively.
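A toy slice of what a context metamodel plus first-order-logic-style rules might look like is sketched below; the entities, context attributes and the adaptation rule are invented for the smart-meeting-room flavour of the scenario and are not the CIM-CSS metamodel itself.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Entity:
    name: str
    context: Dict[str, object]   # context elements attached to the entity

Rule = Callable[[Entity], bool]

def forall(entities: List[Entity], rule: Rule) -> bool:
    return all(rule(e) for e in entities)

def exists(entities: List[Entity], rule: Rule) -> bool:
    return any(rule(e) for e in entities)

# Hypothetical smart-meeting-room scenario: start the presentation mode only if
# some display is free and every attendee is already in the room.
people   = [Entity("alice", {"location": "room-1"}), Entity("bob", {"location": "room-1"})]
displays = [Entity("screen-1", {"in_use": False})]

start_presentation = (exists(displays, lambda d: d.context["in_use"] is False)
                      and forall(people, lambda p: p.context["location"] == "room-1"))
print(start_presentation)   # True -> the system adapts its behaviour
```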
In real-world engineering problems, simulation experiments in the form of computer codes are run in order to reduce the cost of optimizing physical processes. It is desirable to improve the validity of simulation-optimization results by attending to the sources of variability in the model's output(s), since uncertainty can increase complexity and computational cost in the Design and Analysis of Computer Experiments (DACE). In this state-of-the-art review paper, a systematic qualitative and quantitative review of Metamodel-Based Robust Simulation Optimization (MBRSO) for black-box and expensive simulation models under uncertainty is carried out. The focus is on the management of uncertainty, particularly from the Taguchi perspective on robust design and on robust optimization methods in the class of dual response methodology, where the simulation optimization is handled by surrogates. Finally, both trends and gaps in the research field are highlighted and some suggestions for future research are given.
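The dual response methodology the review focuses on can be sketched as follows: separate metamodels are fitted for the mean and the standard deviation of a noisy simulation output, and a robust design point is chosen by trading the two off. The simulate function, the crossed design and the weighting factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expensive simulation: output depends on a design variable x and a
# noise variable z (the uncertainty the robust design must tolerate).
def simulate(x, z):
    return (x - 1.5) ** 2 + 0.8 * z * x + rng.normal(0.0, 0.05)

# Crossed design: a few design points, several noise replicates per point.
xs = np.linspace(0.0, 3.0, 9)
reps = np.array([[simulate(x, z) for z in rng.normal(0.0, 1.0, 30)] for x in xs])
mean_y, std_y = reps.mean(axis=1), reps.std(axis=1)

# Dual response surfaces: one quadratic metamodel for the mean, one for the std.
mean_poly = np.polyfit(xs, mean_y, 2)
std_poly  = np.polyfit(xs, std_y, 2)

# Robust objective in the Taguchi spirit: trade mean performance against variability.
grid = np.linspace(0.0, 3.0, 301)
robust_loss = np.polyval(mean_poly, grid) + 2.0 * np.polyval(std_poly, grid)
print("robust design point x* ~", round(float(grid[np.argmin(robust_loss)]), 2))
```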
The paper presents a multi-fidelity global metamodel for expensive computer simulations, developed as an essential part of efficient simulation-based design optimization under uncertainty. High- and low-fidelity solvers are managed through a multi-fidelity adaptive sampling procedure. The multi-fidelity approximation is built as the sum of a low-fidelity-trained metamodel and a metamodel of the difference (error) between high- and low-fidelity simulations. The metamodels are based on dynamic stochastic radial basis functions, which provide the prediction along with its associated uncertainty. New training points are placed where the prediction uncertainty is maximum. The prediction uncertainties of the low-fidelity metamodel and of the error metamodel are considered for the adaptive refinement of the low- and high-fidelity training sets, respectively. The method is demonstrated on three analytical test problems and one simple industrial application in ship hydrodynamics. The fitting error of the multi-fidelity metamodel is used as the evaluation metric. The comparison with a high-fidelity-trained metamodel shows the effectiveness of the present method.
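A minimal sketch of the multi-fidelity construction, using scikit-learn Gaussian processes with an RBF kernel as a stand-in for the paper's dynamic stochastic radial basis functions; the analytical hi_fi/lo_fi functions replace the expensive and cheap solvers, and the next sample is placed where the combined prediction uncertainty is largest.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Analytical stand-ins for the solvers (not the paper's hydrodynamics codes).
def hi_fi(x):  return np.sin(8.0 * x) + x
def lo_fi(x):  return np.sin(8.0 * x)             # cheap, biased approximation

x_lo = np.linspace(0.0, 1.0, 25).reshape(-1, 1)   # many cheap samples
x_hi = np.linspace(0.0, 1.0, 6).reshape(-1, 1)    # few expensive samples

gp_lo  = GaussianProcessRegressor(kernel=RBF(0.1)).fit(x_lo, lo_fi(x_lo).ravel())
gp_err = GaussianProcessRegressor(kernel=RBF(0.1)).fit(
    x_hi, (hi_fi(x_hi) - lo_fi(x_hi)).ravel())    # metamodel of the hi/lo difference

def predict(x):
    """Multi-fidelity prediction with its uncertainty (sum of the two metamodels)."""
    m_lo, s_lo = gp_lo.predict(x, return_std=True)
    m_er, s_er = gp_err.predict(x, return_std=True)
    return m_lo + m_er, np.sqrt(s_lo ** 2 + s_er ** 2)

# Adaptive sampling: place the next expensive run where the uncertainty is largest.
x_cand = np.linspace(0.0, 1.0, 501).reshape(-1, 1)
mean, std = predict(x_cand)
print("next high-fidelity sample at x =", float(x_cand[np.argmax(std)]))
```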