Amarendra Chaudhuri | Indian Institute of Engineering Science and Technology, Shibpur
Papers by Amarendra Chaudhuri
Physics is considered to be the most basic of the natural sciences. It deals with the fundamental constituents of matter and their interactions as well as the nature of atoms and the build-up of molecules and condensed matter. It tries to give unified descriptions of the behavior of matter as well as of radiation, covering as many types of phenomena as possible. In some of its applications, it comes close to the classical areas of chemistry, and in others there is a clear connection to the phenomena traditionally studied by astronomers. Present trends are even pointing toward a closer approach of some areas of physics and microbiology.
The creation of economic institutions that can function well under the impact of substantial risks is analogous to the dilemmas confronted in the face of large-scale ecological uncertainties. The ultimate solution was not the development of technology that could ride out repeated catastrophe, but rather the invention of culturally adapted ecosystems constructed so as to maximize food yield, minimize risks of famine, and adhere to human development indices. Ecosystems permitted a transition to village, city, and larger-scale human communities. From that perspective, our current boom-and-bust `globalized' economic structure might be seen as primitive indeed, and may not permit long-term maintenance of current levels of human population, with all that implies. Recent advances in evolutionary theory applied to economic structure and process may permit construction of both new economic theory and new tools for data analysis that can help in the design of more robust economic policies and practices, resulting in less frequent and less disruptive transitions, and enabling as well the design of institutions that are, in turn, less disrupted. In brief, this article deals with the dynamics of social entities of appeasing tribal land for discriminating development. A central and repeated feature of this work, however, is a cognitive paradigm with the embedding environment, producing a complicated intermeshing that confounds simple compartmentalizations of human endeavours.
One of the greatest challenges facing 21st century science is to understand the human brain. If science can rise to the challenge, it can gain fundamental insights into what it means to be human, develop new treatments for brain diseases, and build revolutionary new Information and Communications Technologies (ICT). The convergence between ICT and biology has reached a point at which it can turn this dream into reality (The HBP, Lausanne, April 2012).
Financial engineering, the most computationally intense subfield of finance, has only come to be recognized as a formal profession over the last four or five years. During that time, the International Association of Financial Engineers (IAFE) has been instrumental in helping to define the profession and in promoting the creation of financial engineering research programs. Technological science has recently revealed a revolution in financial services. For more than a half-century, statistics and technical analysis were the technologies of choice for financial analysts. However, it was not until the introduction of the Hamilton-Jacobi-Bellman and Black-Scholes differential equations in the mid-1970s that more advanced forms of mathematics were used in the field of finance. Since that time there has been a tremendous expansion in the application of mathematics and other engineering technologies to the field of finance.
The issue of continuous and controlled drug delivery is important for the
treatment of diabetes and many other chronic medical conditions. Site-specific drug
delivery is presently adopted by using programmable and implantable drug delivery
devices, which allows for higher drug concentration at the site where it is actually
necessary. In the present paper, linear quadratic control algorithms have been
developed to deliver insulin via an implantable micro-insulin dispenser for blood
glucose regulation in type 1 or insulin-dependent diabetes mellitus (IDDM) patients.
The micro-insulin dispenser model and the non-linear physiological model of the patient
have been developed and the combined system is linearized to obtain a 9th order state
space model. The controller performance has been tested to find the ability to track the
normoglycaemic set point of 81.1 mg/dl with effective rejection of 60g oral glucose
tolerance test (OGTT) and exercise disturbances.
Keywords: Glucose insulin interaction, micro-insulin dispenser, LQG control, augmented
system, disturbance rejection.
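As a hedged illustration of the linear quadratic design the abstract describes, the sketch below solves an LQR problem for a toy two-state linearized glucose-insulin model. The matrices are invented placeholder values, not the paper's 9th-order patient/dispenser model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy 2-state linearized glucose-insulin model (illustrative values,
# not the paper's 9th-order state-space model).
A = np.array([[-0.05, -0.5],
              [0.0,  -0.1]])
B = np.array([[0.0],
              [0.1]])
Q = np.diag([1.0, 0.01])   # penalize glucose deviation most heavily
R = np.array([[0.1]])      # penalize insulin delivery effort

# Solve the continuous-time algebraic Riccati equation
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state-feedback gain u = -K x

# Closed-loop dynamics x' = (A - B K) x are stable for this pair
eig = np.linalg.eigvals(A - B @ K)
print(np.all(eig.real < 0))  # True: the closed loop is stable
```

The same recipe scales to the paper's higher-order linearized model; only the matrix dimensions change.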
Abstract
The specification of a model almost exclusively involves purely economic considerations. The model may be used as an aid in economic analysis, policy simulation, or policy optimization, but each case imposes special demands on the specification. The result of such considerations generally determines the overall size of the model, the number of inputs and outputs, and the definition of these variables. In addition, the outputs of the model are usually decomposed into two types: the endogenous variables which are outputs of dynamic equations, called behavioral equations, and variables which are outputs of non-stochastic equations, called definitional equations. A choice must be made as to the use of variables in current prices (inflated) or constant prices (deflated). The economic specification stage can be summarized as one in which the following information is determined:
- The specific purpose of the model, thereby fixing the overall size; and hence, an enumeration of all the outputs and their type and an enumeration of all the inputs and their type.
- The output definitions; whether it is explained by a behavioral equation together with all its explanations (inputs to the equation), or, whether it is determined by a definitional identity.
The second stage is the most challenging of the two. This stage combines the use of a priori economic information, hypothesis testing techniques, and cross-correlation analysis from the black box approach. In econometric terminology, the word “structure” denotes a complete definition of the functional or stochastic relationships between all of the endogenous and exogenous variables. The specific meaning of structure can be ascertained by examining each equation of the structural form. Before accepting the results of any estimation, they must be tested for their adequacy. The auto- and cross-correlation functions for the model residuals constitute an important diagnostic check. The last diagnostic to be employed is perhaps the most important, namely, the model’s forecasting performance. After having successfully met the other diagnostic checks, a model is not accepted until it has demonstrated its ability to forecast. Forecasts are then made with each model from the end of its sample period up to the present, using the (historical) observed inputs over this period. Thus, forecasts are obtained outside of the sample period. Such simulations more closely approach reality and serve as a good guide in judging the model’s adequacy in forecasting the unknown future. This gives additional insight into the time-invariance of the model structure. The modeling procedure described in this paper was designed to incorporate three concepts. First, use is made of all available a priori information, thus eliminating beforehand the possibility of expending effort on fruitless searches for non-existent relationships (interconnections). Second, the basic philosophy of the “black box” approach is then applied, allowing the data to decide the exact dynamic structure. Thus, overly complex (statistically unsubstantiated) structures are automatically eliminated.
Third, diagnostics are continually employed which are designed to both reveal inadequacies and indicate how improvements can be made.
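The residual whiteness check mentioned above can be sketched as follows. This is a generic illustration on simulated white residuals, not the paper's models: for an adequate model, the sample autocorrelations of the residuals should mostly stay inside the approximate 95% bounds of plus or minus 2 over the square root of N.

```python
import numpy as np

def autocorr(x, max_lag):
    """Sample autocorrelation of a residual series at lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom
                     for k in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
residuals = rng.standard_normal(500)   # stand-in for an adequate model's residuals
r = autocorr(residuals, max_lag=10)

# Whiteness check: for white residuals, ~95% of lags fall inside +/- 2/sqrt(N)
bound = 2.0 / np.sqrt(len(residuals))
print(np.mean(np.abs(r) < bound))  # close to 1 for white residuals
```

Cross-correlation between residuals and inputs can be checked with the same machinery by replacing the second series.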
Soft computing is a collection of methodologies that aim to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness, and low solution cost. Its principal constituents are fuzzy logic, neurocomputing, and probabilistic reasoning. Soft computing is likely to play an increasingly important role in many application areas, including financial engineering. The role model for soft computing is the human mind.
Fuzzy set theory provides a guide to and techniques for forecasting, decision making, conclusions, and evaluations in an environment involving uncertainty, vagueness, and imprecision in business, finance, management, and the socio-economic sciences. It encompasses applications in case studies including stock market strategy. The fuzzy membership functions (unit share price) for low, medium, and high, and the corresponding Bombay Stock Exchange Sensitive Index (BSE SENSEX) memberships for low, medium, and high, are described. It has been shown that the unit share price in a dynamically stable market moves along with the sensitive index. The application of fuzzy control algorithms for market management may appear to be a promising domain for further investigation.
Key words: fuzzy set theory, stock market strategy, membership function, relational matrices
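A minimal sketch of triangular membership functions of the kind the abstract describes for "low", "medium", and "high" share prices. The breakpoints below are hypothetical, not the paper's calibration for share prices or the SENSEX.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical price breakpoints (illustrative only)
price = 150.0
low    = tri(price, 0, 100, 200)
medium = tri(price, 100, 200, 300)
high   = tri(price, 200, 300, 400)
print(low, medium, high)  # a price of 150 is partly "low", partly "medium"
```

A fuzzy relational matrix then maps these price memberships to index memberships, which is the structure the key words point at.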
This paper deals with an application of the cybernetic method of a recursive dynamic least squares instrumental variable algorithm with on-line parameter tracking adaptability for on-line modelling of short-term national market index movement with a time slot of 1 day, having interacting variables such as market price indices of market-dominating fundamentals of industrial production. It introduces the concept of the dynamics of the Indian stock market based on innovative reasoning which proceeds by forming an expectation and verifying it by proper use of an informational data set composed of the available knowledge and intuitive observation. The present investigation unfolds the traditions of controls and systems in estimation, identification, and exploitation of structures to develop efficient algorithms providing opportunities for significant research in system science. The work presented here formalizes a specific dynamic situation, namely the construction of a finite-dimensional process for the daily movement of a national market index. It has been clearly demonstrated with observed data that the flexibility of the algorithms is remarkably broad. Indeed, it is possible to choose free variables in such a way that the entire formal modelling process can be interpreted as a linear quadratic Gaussian problem.
Key words: estimation, identification, parameterization, market index, industrials, recursive analysis
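A minimal sketch of the recursive least-squares estimator with a forgetting factor that underlies this kind of on-line parameter tracking. The simulated one-lag "market" model and its coefficients are invented for illustration, not taken from the paper's data.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98):
    """One recursive least-squares update with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)     # gain vector
    e = y - (phi.T @ theta).item()            # one-step prediction error
    theta = theta + k * e
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Identify y_t = 0.8*y_{t-1} + 0.3*u_t from simulated daily data
rng = np.random.default_rng(1)
u = rng.standard_normal(300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + 0.3 * u[t] + 0.01 * rng.standard_normal()

theta = np.zeros((2, 1))
P = 1e3 * np.eye(2)        # large initial covariance: uninformative prior
for t in range(1, 300):
    phi = np.array([y[t - 1], u[t]])
    theta, P = rls_step(theta, P, phi, y[t])
print(theta.ravel())  # estimates approach [0.8, 0.3]
```

The forgetting factor below 1 is what gives the estimator its tracking ability when parameters drift.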
Owing to the feedback coupling of the dominant processes of heat release, dynamic instability occurs in
combustion processes. The control of dynamic instability in continuous combustion systems is one of the
most successful applications of control technology in computational fluid mechanics. Dynamic data driven
applications systems (DDDAS) paradigms relating to propulsion processes entail the ability to dynamically incorporate data into an executing application. Models of combustion instability have been derived using
system-identification-based methods. The heat release from the combustion of reactants alters the heat
release dynamics closing the loop. The most dominant dynamics that are of concern in combustion systems
pertain to the unsteady temperature. This investigation uses input-output data and a system identification
approach to determine the underlying model for which active control strategies can be designed and
implemented experimentally. The process responsible for energizing the temperature oscillations is heat
release. The modelling of heat release dynamics thus constitutes a study of the mechanisms that induce
these fluctuations and their quantification. The present work attempts to obtain dynamic descriptions of the temperature of a space shuttle main engine for the development of real-time failure detection algorithms using the multilayer group method of data handling (GMDH) algorithm.
Neuromarketing is a relatively new concept which has developed as a consequence of the acceptance, by an increasing number of people, of the idea that there is no objective reality and that the entire world is actually inside our mind; it is the sum of our exclusively subjective perceptions (Pop Ciprian-Marcel, et al., 2009). The science that studies these aspects at a biological and theoretical level is neurology. Neurology and marketing have recently “met” in a series of studies born out of curiosity and the desire for knowledge, leading to the “birth” of the term neuromarketing. When tracking brain functions, neuroscientists generally use either electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) technology. EEG measures fluctuations in the electrical activity directly below the scalp, which occurs as a result of neural activity. By attaching electrodes to subjects' heads and evaluating the electrical patterns of their brain waves, researchers can track the intensity of visceral responses such as anger, lust, disgust, and excitement.
The work presented in this paper is devoted to the area of computer-aided modelling of daily green tea leaf
production as a memory-type as well as weather-dependent state variable with the help of the technique of self-organization of different layer-wise mathematical descriptions. The composite polynomials obtained with the group method of data handling (GMDH) algorithm provide the conceptual framework to understand the complex process of weather-dependent daily green tea leaf production.
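A one-layer sketch of the GMDH idea the abstract relies on: fit competing quadratic "partial descriptions" over every pair of inputs and keep the one with the lowest error. The weather inputs and the yield relation below are simulated placeholders, not the paper's tea leaf data.

```python
import numpy as np
from itertools import combinations

def fit_quadratic_pair(xi, xj, y):
    """Least-squares fit of y = a0 + a1*xi + a2*xj + a3*xi^2 + a4*xj^2 + a5*xi*xj."""
    Phi = np.column_stack([np.ones_like(xi), xi, xj, xi**2, xj**2, xi * xj])
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coef, Phi @ coef

# Hypothetical weather inputs (say temperature, rainfall, humidity) and a yield
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 3))
y = 1.0 + 0.5 * X[:, 0] + 0.2 * X[:, 0] * X[:, 1] + 0.05 * rng.standard_normal(200)

# One GMDH layer: fit every input pair, keep the description with lowest error
best = min(
    ((i, j) for i, j in combinations(range(3), 2)),
    key=lambda ij: np.mean(
        (y - fit_quadratic_pair(X[:, ij[0]], X[:, ij[1]], y)[1]) ** 2),
)
print(best)  # the pair generating the data is selected
```

A multilayer GMDH repeats this, feeding the surviving partial descriptions forward as inputs to the next layer.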
The river Teesta, fed by the snow and glaciers of Kanchenjungha and the great Himalayas, originates in north Sikkim at an elevation of 8400 m above mean sea level. Glaciers are normally described as a mass of ice slowly moving down a gradient. A glacier consists of ice crystals, water, and rock debris; of these, ice is an essential part of a glacier. Based upon morphological characteristics, glaciers can be grouped into classes such as ice sheet, ice cap, and glacier constrained by topography. Ice sheets and ice caps are formed when the underlying topography is fully submerged by ice and glacier flow is not influenced by topography. On the other hand, when glaciers are constrained by the surrounding topography and the shape of the valley influences their flow, such glaciers are classified as valley glaciers, cirque glaciers, and ice fields. Sikkim Himalayan glaciers are constrained by topography and are predominantly of the valley type.
The control of dynamic instability in continuous combustion systems is one of the most successful applications of control technology in fluid systems. Recent results in this area have shown unequivocally that active control is a feasible technology for reducing unsteady temperature oscillations. Models of combustion instability have been derived using system-identification-based methods. Dynamic instability occurs in combustion processes due to the feedback coupling of the dominant processes of heat release. The heat release from the combustion of reactants alters the heat release dynamics, closing the loop. The most dominant dynamics that are of concern in combustion systems pertain to the unsteady temperature. This investigation uses input-output data and a system identification approach to determine the underlying model. Here we describe only those models for which active control strategies can be designed and implemented experimentally. The process responsible for energizing the temperature oscillations is heat release. The modelling of heat release dynamics thus constitutes a study of the mechanisms that induce these fluctuations and their quantification. The modelling approach described in this work is a black-box approach that consists of dynamic modelling using input-output data that represent the response of the overall combustion process. The results obtained using such an approach are described. Development of failure detection algorithms for identification, prediction, and control of combustion instability is a major area of research in the control sciences. The results of recent theoretical and experimental efforts and the relevant mathematical analysis have opened the possibilities for solution of failure detection problems in combustion processes.
The present work attempts to obtain dynamic descriptions of the temperature of a space shuttle main engine for the development of real-time failure detection algorithms using the group method of data handling (GMDH) algorithm and a recursive algorithm. Part I of the work describes the development of the model with GMDH and Part II with the recursive algorithm.
Key words: combustion instability, GMDH algorithm, recursive algorithm, unsteady temperature, space
shuttle main engine
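As a hedged sketch of the input-output system identification step, the code below fits an ARX model by least squares to simulated data from a lightly damped second-order system standing in for the unsteady-temperature dynamics. The coefficients are invented for illustration, not engine data.

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares ARX fit: y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j]."""
    n = max(na, nb)
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(n, len(y))]
    Phi = np.array(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta[:na], theta[na:]

# Simulate a hypothetical oscillatory 2nd-order "unsteady temperature" response
rng = np.random.default_rng(3)
u = rng.standard_normal(1000)
y = np.zeros(1000)
for t in range(2, 1000):
    y[t] = 1.5 * y[t - 1] - 0.7 * y[t - 2] + 0.5 * u[t - 1] + 0.1 * u[t - 2]

a, b = fit_arx(u, y)
print(a, b)  # recovers [1.5, -0.7] and [0.5, 0.1] on this noise-free data
```

A residual whiteness test on the fitted model is the natural next step before using it for failure detection.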
The aircraft models are based on fundamental principles of rigid-body mechanics, aerodynamics, and propulsion. This article describes our experiences in addressing the system integration issues arising from consideration of aircraft subsystems, including the aircraft structure and geometry, aerodynamic features, and the engine characteristics. In our work the reduced-overshoot response is further obtained using a Butterworth filter and a Diophantine controller. It has been found that the method speeds up the response of the aircraft and reduces the subsequent overshoot. It is hoped that this method can be applied to the physical plant of a military aircraft.
Key words: control augmentation, maneuverability, stability augmentation systems, Kharitonov’s Polynomials,
speeding the response
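A sketch of the Butterworth prefiltering idea from the abstract: smoothing the command to an underdamped plant reduces the step-response overshoot. The plant and cutoff below are illustrative values, not the paper's aircraft model.

```python
import numpy as np
from scipy import signal

# Underdamped plant standing in for a pitch response: 4 / (s^2 + 0.8 s + 4)
plant = signal.TransferFunction([4.0], [1.0, 0.8, 4.0])

# 2nd-order low-pass Butterworth prefilter (the 1.5 rad/s cutoff is an assumed value)
bf, af = signal.butter(2, 1.5, btype='low', analog=True)

t = np.linspace(0, 15, 3000)
_, y_plain = signal.step(plant, T=t)

# Cascade the prefilter with the plant and take the filtered step response
cascade = signal.TransferFunction(np.polymul(bf, plant.num),
                                  np.polymul(af, plant.den))
_, y_filtered = signal.step(cascade, T=t)

print(y_plain.max(), y_filtered.max())  # the prefiltered response overshoots less
```

The trade-off is the added lag of the filter, which is why the paper pairs it with a controller that speeds the loop back up.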
This paper determines how a synthesis technique offers improved quality of control laws. It has been shown in the paper that when control techniques are combined with a good understanding of the control problem, significant benefits accrue. Stability and control have been major technical challenges of control system design. Inadequate solutions to the stability and control problem have in many cases been directly attributed to slow controllers. With the introduction of fast-acting drive systems, for example in military aircraft, the importance of control law synthesis in the design of fast and robust controllers has increased tremendously, to improve the fast-acting behavior of the control mechanism. In particular, the forces required to maneuver the aircraft became expensive and beyond the pilot's ability. This led to fast-acting robust controllers. This paper highlights the importance of the role of control law synthesis in the design of fast and robust controllers. The paper shows how a classical feedback controller can be used to furnish fast-acting characteristics in the face of parametric perturbation.
Key Words: transient response, overshoot, characteristic ratios, generalized time constant, interval
polynomials, robust stability
The determination of the closed-loop transfer function often results in an eighth- or higher-order system of equations with little or no coupling between the longitudinal and lateral dynamics of an aircraft. For this reason, the longitudinal and lateral dynamics can be decoupled completely in most cases and studied separately. This requires transfer function reduction in linear time-invariant systems, and the corresponding control theories have become cornerstones for important new theoretical developments with far-reaching implications. The key computational problem there is the calculation of a balancing transformation and the matrices of the balanced realization. In this paper we concentrate on describing numerical algorithms for computing state-space balancing transformations for transfer function reduction. Simple linear controllers are normally preferred over complex linear controllers for linear time-invariant plants. It is therefore necessary to reduce the order of the physical plant transfer function. There are fewer things to go wrong in the hardware or bugs to fix in the software; they are easier to understand; and the computational requirements are less if the order of the transfer function is less. A great deal of qualitative/quantitative knowledge exists which is vital in the application of the design algorithms to practical procedures. Development of controllability and observability gramians is key to reduction procedures. Such procedures are the subject of this paper. MATLAB procedures are extensively used in this work.
Key words: balancing transformation, controllability, observability, gramians, model reduction
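A sketch of the square-root balancing computation the paper describes, in Python rather than MATLAB, with an illustrative stable system in place of any aircraft model: the two Lyapunov equations give the gramians, and the transformation T makes both equal to the diagonal of Hankel singular values.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# Illustrative stable 4th-order SISO system (placeholder values)
A = np.diag([-1.0, -2.0, -5.0, -50.0])
B = np.array([[1.0], [1.0], [0.5], [0.01]])
C = np.array([[1.0, 0.5, 0.2, 0.01]])

# Controllability and observability gramians from Lyapunov equations:
#   A Wc + Wc A' = -B B'   and   A' Wo + Wo A = -C' C
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# Square-root balancing: T maps both gramians onto diag(Hankel singular values)
Lc = cholesky(Wc, lower=True)
U, s, _ = svd(Lc.T @ Wo @ Lc)
hsv = np.sqrt(s)                  # Hankel singular values, sorted descending
T = Lc @ U / np.sqrt(hsv)         # balancing transformation
Ab = np.linalg.solve(T, A @ T)    # balanced A = T^-1 A T

print(hsv)  # small trailing values mark states that are safe to truncate
```

Truncating the states with negligible Hankel singular values yields the reduced-order model the paper's procedures aim at.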
Convex optimization problems for the development of linear time-invariant controllers are more prevalent in practice than was previously thought. Since 1990 many applications have been discovered in areas of automatic control systems of aircraft. The solution methods are reliable enough to be embedded in a computer-aided design or analysis tool, or even a real-time automatic control system. There are also theoretical and conceptual advantages to formulating a problem as a convex optimization problem. Our main goal is to develop a working knowledge of convex optimization, i.e., to develop the skills and background needed to recognize, formulate, and solve convex optimization problems pertaining to aircraft control. In this paper a design method is proposed to solve control system design problems in which a set of multiple closed-loop performance specifications must be simultaneously satisfied. To utilize this approach, all closed-loop performance specifications considered must have the property that they are convex with respect to the closed-loop system transfer matrix. For each closed-loop performance specification, a closed-loop controller is chosen from the set of all linear controllers, determined by trial and error, such that the specification is satisfied. The transfer matrix of the final system is determined through the convex combination of the transfer matrices of the plant and the controllers. With the closed-loop transfer matrices given, the closed-loop controller structures and gains are solved algebraically. In this paper we establish conditions for the existence of a design parameter inherent in convexity. The experimental verification deals with a problem of pitch control of the flight dynamics of a rigid-body aircraft.
Key words: convex controller, parameter optimization, transfer matrices, integral square error (ISE), pitch control
The recent field called neuromarketing applies the tools of neuroscience to determine why we like some products over others. Neuroscience explains how raw brain data is helping researchers open the mysteries of consumer choice. When tracking brain functions, neuroscientists generally use either electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) technology. Fluctuations in the electrical activity directly below the scalp are measured by EEG, while blood flow throughout the brain is tracked by fMRI. Studies have shown that activity in that brain area can predict the future popularity of an experience or a product. For businesses planning to outsource neuromarketing services, marketing researchers often advise seeking out a firm that was founded by an operational scientist, or one that has a strong science advisory board. This research shows the effect of source certainty, that is, the level of certainty expressed by a message source, on arguments. In experiments, consumers receive persuasive messages from sources of varying expertise and certainty. Across studies, low-expertise sources violate expectancies, stimulate involvement, and promote persuasion when they express certainty, whereas high-expertise sources violate expectancies, stimulate involvement, and promote persuasion when they express uncertainty.
When a particle is formed, an anti-particle is formed. Both violently collide and are lost. How are we here?
“According to Guatemalan Mayan vision of the cosmos, every form of life emerges from the same origin or seeds. Some seeds become trees, others flowers, others water, others human beings. Thus each creature is inextricably linked to all others and what one does to a tree affects not only the tree but oneself and other creatures. This inter-relatedness calls for profound respect between people and their Creator, between people and nature, and among people themselves.” OUR JOURNEY GOES ON
Physics is considered to be the most basic of the natural sciences. It deals with the fundamental... more Physics is considered to be the most basic of the natural sciences. It deals with the fundamental constituents of matter and their interactions as well as the nature of atoms and the build-up of molecules and condensed matter. It tries to give unified descriptions of the behavior of matter as well as of radiation, covering as many types of phenomena as possible. In some of its applications, it comes close to the classical areas of chemistry, and in others there is a clear connection to the phenomena traditionally studied by astronomers. Present trends are even pointing toward a closer approach of some areas of physics and microbiology.
The creation of economic institutions that can function well under the impact of substantial risk... more The creation of economic institutions that can function well under the impact of substantial risks is analogous to the dilemmas confronting in the face of large-scale ecolog-ical uncertainties. The ultimate solution was not the development of technology that could ride out repeated catastrophe, but rather the invention culturally-adapted ecosystems constructed so as to maximize food yield and minimize risks of famine and adherence to human development indices. Ecosystems permitted a transition to village, city, and larger scale human communities. From that perspective, our current boom-and-bust `globalized' economic structure might be seen as primitive indeed, and may not permit long-term maintenance of current levels of human population, with all that implies. Recent advances in evolutionary theory applied to economic structure and process may permit construction of both new economic theory and new tools for data analysis that can help in the design of more robust economic policies and practices, resulting in less frequent and less disruptive transitions, and enabling as well the designing of institutions that are, in turn, less disrupted. In brief this article deals with the dynamics of social entities of appeasing tribal land for discriminating development. A central and repeated feature of this work, however, is a cognitive paradigm with the embedding environment, producing a complicated intermeshing that confounds simple compartmentalizations of human endeavours.
One of the greatest challenges facing 21st century science is to understand the human brain. If s... more One of the greatest challenges facing 21st century science is to understand the human brain. If science can rise to the challenge, it can gain fundamental insights into what it means to be human, develop new treatments for brain diseases, and build revolutionary new Information and Communications Technologies (ICT). The convergence between ICT and biology has reached a point at which it can turn this dream into reality (The HBP, Lausanne, April 2012).
Financial engineering, the most computationally intense subfield of finance, has only come to be ... more Financial engineering, the most computationally intense subfield of finance, has only come to be
recognized as a formal profession over the last four or five years. During that time, the International
Association of Financial Engineers (IAFE) has been instrumental in helping to define the profession and in
promoting the creation of financial engineering research programs. Technological sciences recently reveals
the revolution in financial services. For more than a half-century statistics and technical analysis have been
the technologies of choice for financial analysts. However, it was not until the introduction of the
Hamiltonian-Jacobi-Bellman and Black-Scholes differential equation in the mid-70’s that more advanced
forms of mathematics were used in the field of finance. Since that time there has been a tremendous
expansion in the application of mathematics and other engineering technologies to the field of finance.
The issue of continuous and controlled drug delivery is important for the
treatment of diabetes and many other chronic medical conditions. Site-specific drug
delivery is presently achieved using programmable and implantable drug delivery
devices, which allow for a higher drug concentration at the site where it is actually
needed. In the present paper, linear quadratic control algorithms have been
developed to deliver insulin via an implantable micro-insulin dispenser for blood
glucose regulation in type-1 or insulin-dependent diabetes mellitus (IDDM) patients.
The micro-insulin dispenser model and the non-linear physiological model of the patient
have been developed and the combined system is linearized to obtain a 9th order state
space model. The controller performance has been tested to find the ability to track the
normoglycaemic set point of 81.1 mg/dl with effective rejection of 60g oral glucose
tolerance test (OGTT) and exercise disturbances.
Keywords: Glucose insulin interaction, micro-insulin dispenser, LQG control, augmented
system, disturbance rejection.
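The abstract does not reproduce its model equations, so as a hedged sketch of the linear quadratic machinery it invokes, the example below computes a steady-state LQR gain for a toy scalar plant (not the paper's 9th-order glucose-insulin model) by iterating the discrete Riccati recursion to a fixed point.

```python
def lqr_gain(a, b, q, r, iters=500):
    """Steady-state LQR gain for the scalar plant x[k+1] = a*x[k] + b*u[k]
    with stage cost q*x^2 + r*u^2, via the Riccati recursion."""
    P = q
    for _ in range(iters):
        # Discrete algebraic Riccati recursion (scalar form)
        P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)
    return a * b * P / (r + b * b * P)  # feedback law u[k] = -K*x[k]

# An open-loop unstable plant (a > 1) stabilized by state feedback:
a, b = 1.1, 1.0
K = lqr_gain(a, b, q=1.0, r=1.0)
print(abs(a - b * K) < 1.0)  # closed-loop pole inside the unit circle
```

The LQG controller in the paper adds a Kalman state estimator on top of such a gain; the fixed-point structure of the Riccati computation is the same.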
Abstract The specification of a model almost exclusively involves purely economic consideration... more Abstract
The specification of a model almost exclusively involves purely economic considerations. The model may be used as an aid in economic analysis, policy simulation, or policy optimization, but each case imposes special demands on the specification. The result of such considerations generally determines the overal1 size of the model, the number of inputs and outputs, and the definition of these variables. In addition, the outputs of the model are usually decomposed into two types: the endogenous variables which are outputs of dynamic equations, called behavioral equations and variables which are outputs of non-stochastic equations, called definitional equations. A choice must be made as to the use of variables in current price (inflated) or constant price (deflated). The economic specification stage can be summarized as one in which the following information is determined:
- The specific purpose of the model, thereby fixing the overall size; and hence, an enumeration of all the outputs and their type and an enumeration of all the inputs and their type.
- The output definitions; whether it is explained by a behavioral equation together with all its explanations (inputs to the equation), or, whether it is determined by a definitional identity.
The second stage is the more challenging of the two. This stage combines the use of a priori economic information, hypothesis testing techniques, and cross-correlation analysis from the black box approach. In econometric terminology, the word “structure” denotes a complete definition of the functional or stochastic relationships between all of the endogenous and exogenous variables. The specific meaning of structure can be ascertained by examining each equation of the structural form. Before accepting the results of any estimation, they must be tested for their adequacy. The auto- and cross-correlation functions for the model residuals constitute an important diagnostic check. The last diagnostic to be employed is perhaps the most important, namely, the model’s forecasting performance. After having successfully met the other diagnostic checks, a model is not accepted until it has demonstrated its ability to forecast. Forecasts are then made with each model from the end of its sample period up to the present, using the (historical) observed inputs over this period. Thus, forecasts are obtained outside of the sample period. Such simulations more closely approach reality and serve as a good guide in judging the model’s adequacy in forecasting the unknown future. This gives additional insight into the time-invariance of the model structure. The modeling procedure described in this paper was designed to incorporate three concepts. First, use is made of all available a priori information, thus eliminating beforehand the possibility of expending effort on fruitless searches for non-existent relationships (interconnections). Second, the basic philosophy of the “black box” approach is then applied, allowing the data to decide the exact dynamic structure. Thus, overly complex (statistically unsubstantiated) structures are automatically eliminated.
Third, diagnostics are continually employed which are designed to both reveal inadequacies and indicate how improvements can be made.
Soft computing is a collection of methodologies that aim to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness, and low solution cost. Its principal constituents are fuzzy logic, neurocomputing, and probabilistic reasoning. Soft computing is likely to play an increasingly important role in many application areas, including financial engineering. The role model for soft computing is the human mind.
Fuzzy set theory provides a guide to and techniques for forecasting, decision making, conclusions, and evaluations in an environment involving uncertainty, vagueness, and imprecision in business, finance, management, and the socio-economic sciences. It encompasses applications in case studies including stock market strategy. The fuzzy membership functions (unit share price) for low, medium and high, and the corresponding Bombay Stock Exchange Sensitive Index (BSE SENSEX) memberships for low, medium and high, are described. It has been shown that the unit share price in a dynamically stable market moves along with the sensitive index. The application of fuzzy control algorithms for market management may appear to be a promising domain for further investigation.
Key words: fuzzy set theory, stock market strategy, membership function, relational matrices
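The abstract does not state the shape of its membership functions; a common choice is the triangular function, sketched below for illustrative low/medium/high share-price grades. The breakpoints are invented for the example, not taken from the paper.

```python
def triangular(x, a, b, c):
    """Triangular membership: rises from 0 at a to 1 at b, falls back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical low/medium/high grades for a unit share price:
def grades(price):
    return {
        "low": triangular(price, 0, 50, 100),
        "medium": triangular(price, 50, 100, 150),
        "high": triangular(price, 100, 150, 200),
    }

print(grades(75))  # partially "low" and partially "medium"
```

A fuzzy relational matrix of the kind the keywords mention would then map such price grades onto index grades via max-min composition.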
This paper deals with an application of the cybernetic method of a recursive dynamic least squares instrumental variable algorithm with on-line parameter tracking adaptability for on-line modelling of short-term national market index movement with a time slot of 1 day, having interacting variables such as the market price indices of market-dominating fundamentals of industrial production. It introduces the concept of the dynamics of the Indian stock market based on innovative reasoning, which proceeds by forming an expectation and verifying it by proper use of an informational data set composed of the available knowledge and intuitive observation. The present investigation unfolds the traditions of controls and systems in estimation, identification and exploitation of structures to develop efficient algorithms, providing opportunities for significant research in system science. The work presented here formalizes a specific dynamic situation, namely the construction of a finite-dimensional process for the daily movement of a national market index. It has been clearly demonstrated with observed data that the flexibility of the algorithms is remarkably broad. Indeed, it is possible to choose free variables in such a way that the entire formal modelling process can be interpreted as a linear quadratic Gaussian problem.
Key words
Estimation, identification, parameterization, market index, industrials, recursive analysis
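As a hedged sketch of the recursive estimation machinery described above, the example below implements ordinary recursive least squares (without the instrumental-variable refinement the paper uses) for a hypothetical two-parameter linear model fitted from streaming data.

```python
def rls_fit(samples, n=2, lam=1.0, p0=1e6):
    """Ordinary recursive least squares for y = theta . phi.

    samples: iterable of (phi, y) pairs, phi a length-n list.
    lam: forgetting factor (1.0 = no forgetting).
    """
    theta = [0.0] * n
    P = [[p0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for phi, y in samples:
        # Gain vector K = P*phi / (lam + phi'*P*phi)
        Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
        denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
        K = [v / denom for v in Pphi]
        # Parameter and covariance updates
        err = y - sum(theta[i] * phi[i] for i in range(n))
        theta = [theta[i] + K[i] * err for i in range(n)]
        P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(n)] for i in range(n)]
    return theta

# Recover slope and intercept of y = 2*x + 1 from streaming samples:
data = [([x, 1.0], 2.0 * x + 1.0) for x in range(10)]
slope, intercept = rls_fit(data)
print(round(slope, 3), round(intercept, 3))  # → 2.0 1.0
```

A forgetting factor lam < 1 gives the on-line parameter tracking adaptability the abstract refers to, at the cost of higher estimate variance.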
Owing to the feedback coupling of the dominant processes of heat release, dynamic instability occurs in
combustion processes. The control of dynamic instability in continuous combustion systems is one of the
most successful applications of control technology in computational fluid mechanics. Dynamic data driven
applications systems (DDDAS) paradigms relating to propulsion processes entail the ability to dynamically
incorporate data into an execution application. Models of combustion instability have been derived using
system-identification-based methods. The heat release from the combustion of reactants alters the heat
release dynamics closing the loop. The most dominant dynamics that are of concern in combustion systems
pertain to the unsteady temperature. This investigation uses input-output data and a system identification
approach to determine the underlying model for which active control strategies can be designed and
implemented experimentally. The process responsible for energizing the temperature oscillations is heat
release. The modelling of heat release dynamics thus constitutes a study of the mechanisms that induce
these fluctuations and their quantification. The present work attempts to obtain dynamic descriptions of
the temperature of a space shuttle main engine for the development of real-time failure detection algorithms
using the multilayer group method of data handling (GMDH) algorithm.
Neuromarketing is a relatively new concept which has developed as a consequence of the acceptance,
by an increasing number of persons, of the idea that there isn't an objective reality and that the
entire world is actually inside our mind; it is the sum of our exclusively subjective
perceptions (Pop Ciprian-Marcel, et al., 2009). The science that studies these aspects at a
biological and theoretical level is neurology. Neurology and marketing have recently “met” in a
series of studies that resulted from curiosity and the desire for knowledge, leading to the “birth” of the
neuromarketing term. When tracking brain functions, neuroscientists generally use either
electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) technology.
EEG measures fluctuations in the electrical activity directly below the scalp, which occur as a
result of neural activity. By attaching electrodes to subjects' heads and evaluating the electrical
patterns of their brain waves, researchers can track the intensity of visceral responses such as
anger, lust, disgust, and excitement.
The work presented in this paper is devoted to the area of computer-aided modelling of daily green tea leaf
production as memory-type as well as weather-dependent state variables with the help of the technique
of self-organization of different layer-wise mathematical descriptions. The composite polynomials obtained
with the group method of data handling (GMDH) algorithm provide the conceptual framework to understand
the complex process of weather-dependent daily green tea leaf production.
The river Teesta, fed by the snow and glaciers of Kanchenjungha and the great Himalayas, originates in north Sikkim at an elevation of 8400 m above mean sea level. Glaciers are normally described as a mass of ice slowly moving down a gradient. A glacier consists of ice crystals, water and rock debris. Of these, ice is the essential part of a glacier. Based upon the morphological characteristics of glaciers, they can be grouped into classes such as ice sheet, ice cap, and glacier constrained by topography. Ice sheets and ice caps are formed when the underlying topography is fully submerged by ice and glacier flow is not influenced by topography. On the other hand, when glaciers are constrained by the surrounding topography and the shape of the valley influences their flow, such glaciers are classified as valley glaciers, cirque glaciers and ice fields. Sikkim Himalayan glaciers are constrained by topography and are predominantly of the valley type.
The control of dynamic instability in continuous combustion systems is one of the most successful applications of control technology in fluid systems. Recent results in this area have shown unequivocally that active control is a feasible technology for reducing unsteady temperature oscillations. Models of combustion instability have been derived using system-identification-based methods. Dynamic instability occurs in combustion processes due to the feedback coupling of the dominant processes of heat release. The heat release from the combustion of reactants alters the heat release dynamics, closing the loop. The most dominant dynamics that are of concern in combustion systems pertain to the unsteady temperature. This investigation uses input-output data and a system identification approach to determine the underlying model. Here we describe only those models for which active control strategies can be designed and implemented experimentally. The process responsible for energizing the temperature oscillations is heat release. The modelling of heat release dynamics thus constitutes a study of the mechanisms that induce these fluctuations and their quantification. The modelling approach described in this work is a black-box approach that consists of dynamic modelling using input-output data that represent the response of the overall combustion process. The results obtained using such an approach are described. The development of failure detection algorithms for identification, prediction and control of combustion instability is a major area of research in the control sciences. The results of recent theoretical and experimental efforts and the relevant mathematical analysis have opened the possibilities for the solution of failure detection problems in combustion processes.
The present work attempts to obtain dynamic descriptions of the temperature of a space shuttle main engine for the development of real-time failure detection algorithms using the group method of data handling (GMDH) algorithm and a recursive algorithm. Part I of the work describes the development of the model with GMDH, and Part II with the recursive algorithm.
Key words: combustion instability, GMDH algorithm, recursive algorithm, unsteady temperature, space
shuttle main engine
Aircraft models are based on fundamental principles of rigid-body mechanics, aerodynamics, and propulsion. This article describes our experiences in addressing the system integration issues arising from consideration of aircraft subsystems, including the aircraft structure and geometry, aerodynamic features, and the engine characteristics. In our work, a reduced-overshoot response is further obtained using a Butterworth filter and a Diophantine controller. It has been found that the method speeds up the response of the aircraft and reduces the subsequent overshoot. It is hoped that this method can be applied to the physical plant of a military aircraft.
Key words: control augmentation, maneuverability, stability augmentation systems, Kharitonov’s Polynomials,
speeding the response
This paper shows how a synthesis technique offers improved quality of control laws. It has been shown in the paper that when control techniques are combined with a good understanding of the control problem, significant benefits accrue. Stability and control have been major technical challenges of control system design. In many cases, inadequate solutions to the stability and control problem have been directly attributed to slow controllers. With the introduction of fast-acting drive systems, for example in military aircraft, the importance of the role of control law synthesis in the design of fast and robust controllers has increased tremendously, to improve the fast-acting behaviour of the control mechanism. In particular, the forces required to maneuver the aircraft became expensive and beyond the pilot's ability. This led to fast-acting robust controllers. This paper highlights the importance of the role of control law synthesis in the design of fast and robust controllers. It shows how a classical feedback controller can be used to furnish fast-acting characteristics in the face of parametric perturbation.
Key Words: transient response, overshoot, characteristic ratios, generalized time constant, interval
polynomials, robust stability
The determination of the closed-loop transfer function often results in an eighth- or higher-order system of equations with little or no coupling between the longitudinal and lateral dynamics of an aircraft. For this reason, the longitudinal and lateral dynamics can be decoupled completely in most cases and studied separately. This requires transfer function reduction in linear time-invariant systems, and the corresponding control theories have become cornerstones for important new theoretical developments with far-reaching implications. The key computational problem is the calculation of a balancing transformation and the matrices of the balanced realization. In this paper we concentrate on describing numerical algorithms for computing state-space balancing transformations for transfer function reduction. Simple linear controllers are normally preferred over complex linear controllers for linear time-invariant plants. It is therefore necessary to reduce the order of the physical plant transfer function. There are fewer things to go wrong in the hardware or bugs to fix in the software; the controllers are easier to understand; and the computational requirements are lower if the order of the transfer function is lower. A great deal of qualitative and quantitative knowledge exists which is vital in the application of the design algorithms to practical procedures. Development of the controllability and observability grammians is key to the reduction procedures. Such procedures are the subject of this paper. MATLAB procedures are extensively used in this work.
Key Words: balancing transformation, controllability, observability, grammians, model reduction
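The grammians referred to above can be illustrated with a toy computation. The sketch below (an illustrative example, not the paper's MATLAB procedure) computes the controllability grammian of a stable discrete-time two-state system by iterating the Lyapunov fixed point W = B Bᵀ + A W Aᵀ; for diagonal A the entries satisfy W[i][j] = b_i b_j / (1 - a_i a_j), which the example checks.

```python
def ctrl_grammian(A, B, iters=200):
    """Controllability grammian of x[k+1] = A x[k] + B u[k] (A stable, discrete),
    via the Lyapunov fixed-point iteration W <- B*B' + A*W*A'."""
    n = len(A)
    W = [[0.0] * n for _ in range(n)]
    BBt = [[B[i] * B[j] for j in range(n)] for i in range(n)]
    for _ in range(iters):
        AW = [[sum(A[i][k] * W[k][j] for k in range(n)) for j in range(n)]
              for i in range(n)]
        AWAt = [[sum(AW[i][k] * A[j][k] for k in range(n)) for j in range(n)]
                for i in range(n)]
        W = [[BBt[i][j] + AWAt[i][j] for j in range(n)] for i in range(n)]
    return W

# Diagonal stable modes at 0.5 and 0.8, each driven by a single input:
A = [[0.5, 0.0], [0.0, 0.8]]
B = [1.0, 1.0]
W = ctrl_grammian(A, B)
print(round(W[0][0], 4))  # 1/(1 - 0.5^2) ≈ 1.3333
```

Balanced truncation would additionally require the observability grammian (the dual iteration with Aᵀ and C) and an eigendecomposition of their product to obtain the Hankel singular values.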
Convex optimization problems for the development of linear time-invariant controllers are more prevalent in practice than was previously thought. Since 1990 many applications have been discovered in the area of automatic control systems for aircraft. The solution methods are reliable enough to be embedded in a computer-aided design or analysis tool, or even a real-time automatic control system. There are also theoretical and conceptual advantages to formulating a problem as a convex optimization problem. Our main goal is to develop a working knowledge of convex optimization, i.e., to develop the skills and background needed to recognize, formulate, and solve convex optimization problems pertaining to aircraft control. In this paper a design method is proposed to solve control system design problems in which a set of multiple closed-loop performance specifications must be simultaneously satisfied. To utilize this approach, all closed-loop performance specifications considered must have the property that they are convex with respect to the closed-loop system transfer matrix. For each closed-loop performance specification, a controller is chosen by trial and error from the set of all linear controllers such that the specification is satisfied. The transfer matrix of the final system is determined through a convex combination of the transfer matrices of the plant and the controllers. With the closed-loop transfer matrices given, the closed-loop controller structures and their gains are solved for algebraically. In this paper we establish conditions for the existence of a design parameter inherent in convexity. The experimental verification deals with a problem of pitch control of the flight dynamics of a rigid-body aircraft.
Key words: convex controller, parameter optimization, transfer matrices, integral square error (ISE), pitch control
The recent field called neuromarketing applies the tools of neuroscience to determine why we like some products over others. Neuroscience explains how raw brain data is helping researchers unlock the mysteries of consumer choice. When tracking brain functions, neuroscientists generally use either electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) technology. Fluctuations in the electrical activity directly below the scalp are measured by EEG, while blood flow throughout the brain is tracked by fMRI. Studies have shown that activity in that brain area can predict the future popularity of an experience or a product. For businesses planning to outsource neuromarketing services, marketing researchers often advise seeking out a firm that was founded by an operational scientist, or one that has a strong science advisory board. This research shows the effect of source certainty, that is, the level of certainty expressed by a message source, on the persuasiveness of arguments. In experiments, consumers receive persuasive messages from sources of varying expertise and certainty. Across studies, low-expertise sources violate expectancies, stimulate involvement, and promote persuasion when they express certainty, whereas high-expertise sources violate expectancies, stimulate involvement, and promote persuasion when they express uncertainty.
When a particle is formed, an anti-particle is formed. Both violently collide and are lost. How are we here?
“According to the Guatemalan Mayan vision of the cosmos, every form of life emerges from the same origin or seeds. Some seeds become trees, others flowers, others water, others human beings. Thus each creature is inextricably linked to all others, and what one does to a tree affects not only the tree but oneself and other creatures. This inter-relatedness calls for profound respect between people and their Creator, between people and nature, and among people themselves.” OUR JOURNEY GOES ON
Nobel Prizes in Physics 1901-2017: A Brief Review of Radical and Strange Insights of the Inquiring Human Spirit
Nobel Prizes in Physics 1901-2016: A Brief Review of Radical and Strange Insights of the Inquiring Human Spirit
Quantum technology has the potential to become a large industry in the future, comparable to the current market for electronics, which is already an enormous, worldwide market. For the economy it will be a game changer. Confidence and persistence are needed, as it is a slow process, but the reward is potentially very high. A classic example is Silicon Valley in the USA, the result of a long but deeply transforming process. Innovation in quantum technologies helps companies to be more pioneering in their approach to business, by updating their production lines or using new technologies to improve their efficiency, or as a basis for new products and services.
Poverty alleviation is the vital challenge facing humanity. Only BPO and the shifting of manufacturing and industrial facilities to poorer countries can meet such challenges. The boom in outsourcing across all market sectors is fed by the realization that organizations cannot excel in all areas, and will benefit from entrusting components of their operation to external agencies specializing in that field. A successful outsourcing arrangement should support an organization's goals to improve profitability and market positioning, not just in terms of a reduced cost base, but also through focusing activity on driving new revenues. Many Indian companies act as outsourcing or service partners to a number of their customers. They provide network connectivity within the EU, the US and offshore to support complex call centre infrastructure, and source customer management services from specialist agencies. The key to successful outsourcing is always the same: transparent and honest relationships where key services are defined and backed by strong service guarantees. The Rupee is rising. World capital is turning toward India. We should not fritter away this golden opportunity. We should use this capital in the right way so that we are able to maintain ourselves as a No. 1 economy in perpetuity, heralding a quality of life for our poverty-stricken people in right earnest.