Mathematical Sciences Research Papers - Academia.edu

In this paper, an algorithm based on unstructured triangular meshes using standard refinement patterns for anisotropic adaptive meshes is presented. It consists of three main actions: anisotropic refinement, solution-weighted smoothing and patch unrefinement. Moreover, a hierarchical mesh formulation is used. The main idea is to use the error and error gradient on each mesh element to locally control the anisotropy of the mesh. The proposed algorithm is tested on interpolation and boundary-value problems with a discontinuous solution.
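
A schematic sketch of the marking step this description implies: the element error decides whether to refine, and the error gradient picks the direction of anisotropic refinement (our own illustration; the function name, the quantile threshold, and the anisotropy test are assumptions, not the authors' algorithm):

```python
import numpy as np

def mark_elements(errors, error_grads, refine_frac=0.2, aniso_ratio=3.0):
    """errors: (n,) per-element error estimates;
    error_grads: (n, 2) per-element error-gradient vectors (assumed available)."""
    threshold = np.quantile(errors, 1.0 - refine_frac)   # refine the worst elements
    mean_grad = np.mean(np.linalg.norm(error_grads, axis=1)) + 1e-30
    marks = []
    for err, grad in zip(errors, error_grads):
        if err < threshold:
            marks.append(("keep", None))
            continue
        gnorm = np.linalg.norm(grad)
        if gnorm > aniso_ratio * mean_grad:
            # strongly directional error: refine transverse to the gradient direction
            marks.append(("anisotropic", grad / gnorm))
        else:
            marks.append(("isotropic", None))
    return marks
```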

Synchronization thresholds of an array of nondiagonally coupled oscillators are investigated. We present experimental results which show the existence of ragged synchronizability, i.e., the existence of multiple disconnected synchronization regions in the coupling parameter space. This phenomenon has been observed in an electronic implementation of an array of nondiagonally coupled van der Pol oscillators. Numerical simulations show good agreement with the experimental observations.
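
A minimal numerical sketch of the kind of system studied: two van der Pol oscillators with a nondiagonal (cross-variable) coupling term. Parameter values and the coupling form are illustrative assumptions, not the experimental circuit from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU, D = 1.0, 0.5   # van der Pol nonlinearity and coupling strength (illustrative)

def coupled_vdp(t, s):
    """Two van der Pol oscillators; the coupling is nondiagonal in the sense that
    each oscillator's velocity equation is driven by the partner's position."""
    x1, y1, x2, y2 = s
    dy1 = MU * (1 - x1**2) * y1 - x1 + D * (x2 - x1)
    dy2 = MU * (1 - x2**2) * y2 - x2 + D * (x1 - x2)
    return [y1, dy1, y2, dy2]

sol = solve_ivp(coupled_vdp, (0, 200), [2.0, 0.0, -1.0, 0.5], max_step=0.01)
sync_error = np.abs(sol.y[0] - sol.y[2])[-2000:].max()   # late-time |x1 - x2|
print("synchronized" if sync_error < 1e-2 else "not synchronized", sync_error)
```

Sweeping D and recording where sync_error drops to zero is the kind of scan that reveals whether the synchronization region is connected or "ragged".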

A new mechanism of fracture toughness enhancement in nanocrystalline metals and ceramics is suggested. The mechanism represents the cooperative grain boundary (GB) sliding and stress-driven GB migration process near the tips of growing cracks. It is shown that this mechanism can increase the critical stress intensity factor for crack growth in nanocrystalline materials by a factor of three or more and thus considerably enhances the fracture toughness of such materials.

Recently, Bañados, Silk and West (BSW) demonstrated that the extremal Kerr black hole can act as a particle accelerator with arbitrarily high center-of-mass energy (E_CM) when the collision takes place near the horizon. The rotating Hayward regular black hole, apart from mass (M) and angular momentum (a), has a new parameter g (g > 0 is a constant) that provides a deviation from the Kerr black hole. We demonstrate that for each g, with M = 1, there exist critical values a_E and r_H^E that correspond to a regular extremal black hole with degenerate horizons; a_E decreases and r_H^E increases as g increases, while a < a_E describes a regular non-extremal black hole with outer and inner horizons. We apply the BSW process to the rotating Hayward regular black hole, for different g, and demonstrate numerically that E_CM diverges in the vicinity of the horizon for the extremal cases, thereby suggesting that a rotating regular black hole can also act as a particle accelerator and thus in turn provide a suitable framework for Planck-scale physics. For the non-extremal case, there always exists a finite upper bound on E_CM, which increases with the deviation parameter g.
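
For context, the center-of-mass energy used in BSW-type analyses is commonly written, for two colliding particles of equal rest mass m_0 with four-velocities u_(1) and u_(2), as (a standard relation, not specific to the Hayward case):

$$
E_{\mathrm{CM}}^{2} = 2\,m_{0}^{2}\left(1 - g_{\mu\nu}\,u_{(1)}^{\mu}\,u_{(2)}^{\nu}\right),
$$

and the BSW effect corresponds to this quantity growing without bound as the collision point approaches the horizon of an extremal black hole.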

We propose a new prediction interval and band for the nonlinear regression model. The construction principle of this interval and band is based on an exact confidence region for parameters of the nonlinear regression model. This region, fully described in Vila & Gauchi (2007), provides a rigorous justification for the new prediction interval and band that we propose. This new band is then compared to the classical bands, and also to the band based on the bootstrap resampling method. The comparison of these bands is undertaken with simulated and real data from predictive modelling in food science.

We have performed a search for bursts of gravitational waves associated with the very bright Gamma Ray Burst GRB030329, using the two detectors at the LIGO Hanford Observatory. Our search covered the most sensitive frequency range of the LIGO detectors (approximately 80-2048 Hz), and we specifically targeted signals shorter than ≃150 ms. Our search algorithm looks for excess correlated power between the two interferometers and thus makes minimal assumptions about the gravitational waveform. We observed no candidates with gravitational wave signal strength larger than a pre-determined threshold. We report frequency dependent upper limits on the strength of the gravitational waves associated with GRB030329. Near the most sensitive frequency region, around ≃250 Hz, our root-sum-square (RSS) gravitational wave strain sensitivity for optimally polarized bursts was better than h_RSS ≃ 6×10^−21 Hz^−1/2. Our result is comparable to the best published results searching for association between gravitational waves and GRBs.
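
As a toy illustration of the "excess correlated power" idea (not the LIGO pipeline), one can compute a sliding-window normalized cross-correlation between the two detector time series and flag windows that exceed a threshold; the window length, threshold, and synthetic data below are assumptions for illustration only.

```python
import numpy as np

def excess_correlation(h1, h2, window=256, threshold=0.5):
    """Toy sliding-window cross-correlation statistic between two strain series."""
    candidates = []
    for start in range(0, len(h1) - window, window // 2):   # 50% overlap
        w1 = h1[start:start + window] - np.mean(h1[start:start + window])
        w2 = h2[start:start + window] - np.mean(h2[start:start + window])
        cc = np.sum(w1 * w2) / np.sqrt(np.sum(w1**2) * np.sum(w2**2) + 1e-30)
        if abs(cc) > threshold:
            candidates.append((start, cc))
    return candidates

# Synthetic example: independent noise plus a short common burst in both streams.
rng = np.random.default_rng(0)
n = 8192
h1, h2 = rng.normal(size=n), rng.normal(size=n)
burst = 5.0 * np.sin(2 * np.pi * 0.05 * np.arange(128))
h1[4000:4128] += burst
h2[4000:4128] += burst
print(excess_correlation(h1, h2))
```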

Three interconnected positions are advocated: (1) although serving as a useful model, the immune self does not exist as such; (2) instead of a self/nonself demarcation, the immune system 'sees' itself, i.e., it does not ignore the 'self' or attack the 'other,' but exhibits a spectrum of responses, which when viewed from outside the system appear as discrimination of 'self' and 'nonself' based on certain criteria of reactivity. When immune reactions are conceived in terms of normal physiology and open exchange with the environment, where borders dividing host and foreign are elusive and changing, host defense is only part of the immune system's functions, which actually comprise two basic tasks: protection, i.e., to preserve host integrity, and maintenance of organismic identity. And thus (3) if the spectrum of immunity is enlarged, differentiating low reactive 'autoimmune' reactions from activated immune responses against the '...

Simple, low-cost and accurate genotyping methods for single nucleotide polymorphisms (SNPs) are in high demand in the post-genome-sequencing era. We present a graphical tool called SNPicker, implemented in Java, which significantly facilitates the design of mutagenic endonuclease restriction assays. SNPicker uses the online NEB REBASE to automatically scan for all possible designs of mutagenic primers that can facilitate the picking of mismatched PCR primers to artificially introduce or abolish a restriction site at the target SNP site. We successfully applied SNPicker in designing endonuclease restriction assays for 14 SNPs for the MTHFR gene, the Coagulation Factor II gene and the Coagulation Factor V gene. The SNP assays designed using SNPicker were cross-validated using the MassARRAY technology.

We introduce the concept of bi-conformal transformation, as a generalization of conformal ones, by allowing two orthogonal parts of a manifold with metric g to be scaled by different conformal factors. In particular, we study their infinitesimal version, called bi-conformal vector fields. We show that these are characterized by the differential conditions £_ξ P ∝ P and £_ξ Π ∝ Π, where P and Π are orthogonal projectors (P + Π = g). Keeping P and Π fixed, the set of bi-conformal vector fields is a Lie algebra which can be finite or infinite dimensional according to the dimensionality of the projectors. We determine (i) when an infinite-dimensional case is feasible and its properties, and (ii) a normal system for the generators in the finite-dimensional case. Its integrability conditions are also analyzed, which in particular provides the maximum number of linearly independent solutions. We identify the corresponding maximal spaces, and show a necessary geometric condition for a metric tensor to be a double-twisted product. More general "breakable" spaces are briefly considered. Many known symmetries are included, such as conformal Killing vectors, Kerr-Schild vector fields, kinematic self-similarity, causal symmetries, and rigid motions.
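
Writing the proportionality factors explicitly (the notation 2φ, 2χ below is an assumption introduced only to parallel the usual conformal Killing equation), the defining conditions quoted above read

$$
\mathcal{L}_{\xi} P_{ab} = 2\phi\, P_{ab}, \qquad
\mathcal{L}_{\xi} \Pi_{ab} = 2\chi\, \Pi_{ab}
\quad\Longrightarrow\quad
\mathcal{L}_{\xi} g_{ab} = 2\phi\, P_{ab} + 2\chi\, \Pi_{ab},
$$

so that the single-factor case φ = χ recovers an ordinary conformal Killing vector.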

The electrical conduction behavior of undoped ultrananocrystalline diamond (UNCD) and its dependence on deposition temperature and chemical structure are presented. UNCD films were grown using a microwave plasma-enhanced chemical vapor deposition technique at deposition temperatures of 400°C and 800°C. The chemical structure of the UNCD films is characterized with several tools, including elastic recoil detection analysis, Fourier transform infrared spectroscopy, electron energy loss spectroscopy, Raman spectroscopy, and environmental scanning electron microscopy. The results show a higher content of sp²-bonded carbon for the 800°C deposition samples (~65%) in comparison with the 400°C samples (~38%). In both kinds of films, the hydrocarbon bonds have saturated sp³ structures, while there is lower hydrogen content in the 800°C samples (~8%) than in the 400°C samples (~10%). For conduction properties, experiments are conducted using a probe station and conductive atomic force microscopy. Experimental data show that the samples deposited at 800°C are several orders of magnitude more conductive than the 400°C samples. The conduction occurs primarily along the grain boundaries for both types of samples. The conductivity of both types of films also shows field-dependent nonlinear behavior. Fits to Poole-Frenkel models and to single and overlapping Coulombic potential models indicate that the conduction is directly correlated with the sp²-bonded carbon density, with the conduction path formed by the network of sp²-bonded carbon and its associated hydrocarbon bonds.
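
For reference, the standard Poole-Frenkel expression for field-assisted emission from Coulombic traps (a textbook form, not the specific fit used in the paper) relates the current density J to the applied field E as

$$
J \propto E \exp\!\left[-\frac{q\left(\phi_{B} - \sqrt{qE/(\pi \varepsilon_r \varepsilon_0)}\right)}{k_{B} T}\right],
$$

where φ_B is the trap barrier height, ε_r the relative permittivity, and T the temperature; the field-dependent barrier lowering is what produces the nonlinear, field-dependent conductivity mentioned above.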

In this work we have studied the dynamic scaling behavior of two scaling functions and we have shown that scaling functions obey the dynamic finite size scaling rules. Dynamic finite size scaling of scaling functions opens possibilities for a wide range of applications.

Federal aviation regulations prohibit airline pilots from flying beyond the age of 60 years. However, the relation between pilot age and flight safety has not been rigorously assessed using empirical data. From 1987 to 1997, the authors followed a cohort of 3,306 commuter air carrier and air taxi pilots who were aged 45-54 years in 1987. During the follow-up period, the pilots accumulated a total of 12.9 million flight hours and 66 aviation crashes, yielding a rate of 5.1 crashes per million pilot flight hours. Crash risk remained fairly stable as the pilots aged from their late forties to their late fifties. Flight experience, as measured by total flight time at baseline, showed a significant protective effect against the risk of crash involvement. With adjustment for age, pilots who had 5,000-9,999 hours of total flight time at baseline had a 57% lower risk of a crash than their less experienced counterparts (relative risk = 0.43, 95% confidence interval: 0.21, 0.87). The protective effect of flight experience leveled off after total flight time reached 10,000 hours. The lack of an association between pilot age and crash risk may reflect a strong "healthy worker effect" stemming from the rigorous medical standards and periodic physical examinations required for professional pilots.
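
A minimal sketch, using only the counts quoted in the abstract, of how the crude crash rate and an exact Poisson confidence interval for it can be computed; the 95% interval is our own illustrative calculation, not a figure reported by the authors.

```python
from scipy.stats import chi2

crashes = 66
flight_hours = 12.9e6   # pilot flight hours accumulated during follow-up

rate = crashes / flight_hours * 1e6          # crashes per million flight hours
# Exact (Garwood) 95% Poisson confidence limits for the crash count.
lo = chi2.ppf(0.025, 2 * crashes) / 2
hi = chi2.ppf(0.975, 2 * (crashes + 1)) / 2
print(f"rate = {rate:.1f} per million hours "
      f"(95% CI {lo / flight_hours * 1e6:.1f}-{hi / flight_hours * 1e6:.1f})")
# rate = 5.1 per million hours, matching the value quoted in the abstract.
```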

Suppose that a large number of individuals are randomly matched into groups where each group plays a finite symmetric game. Individuals breed true according to their individual material payoffs, but the expected number of surviving offspring may depend on the material payoff vector to the whole group. We show that the mean-field equation for the induced population dynamic is equivalent to the replicator dynamic for a game with payoffs derived from those in the original game. We apply this selection dynamic to a number of examples, including prisoners' dilemma games, coordination games, hawk-dove games, a prisoners' dilemma with a punishment option, and common-pool games. For each of these, we provide conditions under which our selection dynamic leads to other outcomes than those obtained under the usual replicator dynamic. By way of a revealed-preference argument, we show how our selection dynamic can explain certain stable behaviors that are consistent with individuals having social preferences.
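
For readers unfamiliar with the benchmark being referenced, the standard replicator dynamic for a population state x = (x_1, ..., x_n) with fitness functions f_i is

$$
\dot{x}_i = x_i\left(f_i(x) - \bar{f}(x)\right),
\qquad \bar{f}(x) = \sum_j x_j f_j(x);
$$

the paper's result is that its group-matching selection dynamic takes this same form, but with payoffs f_i derived from, rather than equal to, the material payoffs of the original game.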

We describe a new curriculum for the final two years of a B.S. program in Physics. Case studies in the junior year provide concrete examples or Paradigms as pillars to support systematic Capstone lectures in the senior year. In each of nine three-week Paradigms, the junior progresses from a descriptive lower-division understanding to an advanced analysis of a topic defined by phenomenon rather than discipline. Students generally view the new format with favor. They are better at visualization and make important connections among physics disciplines. Independent assessment is ongoing.

The problem of plasma-wall interaction requires multiscale methods (molecular dynamics and kinetic Monte Carlo) for both the material and the plasma. One possible concept for such multiscale modelling is presented, including validation strategies.

We establish the conditions under which it is possible to construct signal sets satisfying the properties of being geometrically uniform and matched to additive quotient groups. Such signal sets consist of subsets of signal spaces identified with the integer rings Z[i] and Z[ω] in R².
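
As an illustrative sketch of the kind of object involved (our own toy example; the modulus 3+2i and the reduction rule are assumptions, not taken from the paper), the 13 residue classes of Z[i] modulo 3+2i can be enumerated and used as a two-dimensional constellation:

```python
# Toy example: residues of the Gaussian integers Z[i] modulo m = 3 + 2i.
# Since N(m) = 13 is prime, there are exactly 13 residue classes, and the
# ordinary integers 0..12 form a complete residue system.
m = 3 + 2j

def reduce_mod(z, m):
    """Nearest-Gaussian-integer reduction: z minus the closest multiple of m."""
    q = z / m
    q_round = complex(round(q.real), round(q.imag))
    return z - q_round * m

constellation = sorted({reduce_mod(k + 0j, m) for k in range(13)},
                       key=lambda z: (z.real, z.imag))
print(len(constellation))   # 13 points, one per residue class
print(constellation)
```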

Everywhere in our daily life we encounter nonlinear phenomena; indeed, it is well known that real life is nonlinear. Because nonlinear problems are difficult to solve, researchers have often taken the easier route of linearizing them under simplifying assumptions, which may lead to a loss of important information about the system. For this reason, and to be more realistic, the objective of this special issue is to use modern nonlinear theory, bifurcation, and chaos to establish genuinely nonlinear models for practical systems, engineering sciences, and engineering applications.

We describe a quantum computer emulator for a generic, general purpose quantum computer. This emulator consists of a simulator of the physical realization of the quantum computer and a graphical user interface to program and control the simulator. We illustrate the use of the quantum computer emulator through various implementations of the Deutsch-Jozsa and Grover's database search algorithm.
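
A minimal state-vector sketch of Grover's search (a generic textbook implementation, not the emulator described in the paper), showing how the amplitude of a marked item is amplified:

```python
import numpy as np

def grover(n_qubits=4, marked=5):
    """Plain state-vector simulation of Grover's search over N = 2**n_qubits items."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))          # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1.0                      # oracle: phase-flip the marked item
        state = 2.0 * state.mean() - state         # diffusion: inversion about the mean
    return np.abs(state) ** 2

probs = grover()
print(probs.argmax(), probs.max())   # marked index 5 found with probability ~0.96
```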

A framework for business process management facilitates the implementation, management and improvement of process-based organisations. In order to do so, organisations require new capabilities such as change roadmaps, knowledge management and balanced performance measures.

This article reviews weblogs, their role as a social networking device for young people, and their probable evolution into an organizational computing medium. The motivation of young webloggers, based on an analysis of 33 interviews, and the evaluation of the fit between technology features and the needs of these webloggers are reported. Four user types with their unique profiles and technology needs, and a needs-technology fit model, are described. The findings support the conjecture that organizational computing will (or should) take advantage of employees' social computing expertise and acceptance, and further capture the attention of customers.

We present research that contributes to the debate on the universal applicability of quality management, and in doing so, subject the 'one size fits all' assumption underlying quality management to an empirical examination. Specifically, with the exception of enhancements in instrumentation, we attempted to replicate, as closely as possible, Anderson et al.'s (1995) empirical evaluation of a Deming-based theory of quality management. In our replication, we applied path analysis to secondary, plant-level data provided by a stratified sample of plants in three different industries in Italy. The path analytical results from the current replication study were compared against those reported in Anderson et al. (1995), which used data from U.S.-based plants. These results were then discussed in the context of conducting and interpreting cross-cultural quality management research. We concluded with the need for more replication studies in the quality management discipline and with implications for research and practice.

The aim of this study was to assess the associations of overweight and obesity with lumbar radicular pain and sciatica using a meta-analysis. We searched the PubMed, Embase, Scopus, and Web of Science databases from 1966 to July 2013. We performed a random-effects meta-analysis and assessed publication bias. We included 26 (8 cross-sectional, 7 case-control, and 11 cohort) studies. Both overweight (pooled odds ratio (OR) = 1.23, 95% confidence interval (CI): 1.14, 1.33; n = 19,165) and obesity (OR = 1.40, 95% CI: 1.27, 1.55; n = 19,165) were associated with lumbar radicular pain. The pooled odds ratio for physician-diagnosed sciatica was 1.12 (95% CI: 1.04, 1.20; n = 109,724) for overweight and 1.31 (95% CI: 1.07, 1.62; n = 115,661) for obesity. Overweight (OR = 1.16, 95% CI: 1.09, 1.24; n = 358,328) and obesity (OR = 1.38, 95% CI: 1.23, 1.54; n = 358,328) were associated with increased risk of hospitalization for sciatica, and overweight/obesity was associated with increased risk of surgery for lumbar disc herniation (OR = 1.89, 95% CI: 1.25, 2.86; n = 73,982). Associations were similar for men and women and were independent of the design and quality of included studies. There was no evidence of publication bias. Our findings consistently showed that both overweight and obesity are risk factors for lumbar radicular pain and sciatica in men and women, with a dose-response relationship. Keywords: back pain; hospitalization; intervertebral disc displacement; obesity; overweight; sciatica. Abbreviations: BMI, body mass index; CI, confidence interval; OR, odds ratio. [Figure 3 caption: Results of a meta-analysis of the associations of overweight, overweight/obesity, and obesity with hospitalization due to sciatica and of the association of overweight/obesity with surgery due to sciatica, 1966-2013; the size of the gray shaded area indicates the weight of each study, and horizontal lines show the 95% confidence intervals (CIs). ES, effect size.]
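
A minimal sketch of the random-effects pooling step referred to above; the DerSimonian-Laird estimator used here is a standard choice, and the per-study odds ratios below are made-up placeholders, not the studies from this meta-analysis.

```python
import numpy as np

def pool_random_effects(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios given 95% CIs."""
    y = np.log(ors)                                              # per-study log ORs
    v = ((np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)) ** 2    # variances from CIs
    w = 1.0 / v                                                  # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)                           # heterogeneity statistic
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / C)                      # between-study variance
    w_star = 1.0 / (v + tau2)                                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = 1.0 / np.sqrt(np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# Hypothetical example studies (placeholders only).
print(pool_random_effects(ors=np.array([1.2, 1.4, 1.1]),
                          ci_los=np.array([1.0, 1.1, 0.9]),
                          ci_his=np.array([1.45, 1.8, 1.35])))
```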

Today, there is a growing recognition by researchers and practitioners about the importance of managing knowledge as a critical source for competitive advantage. Various assertions about competitiveness through knowledge management (KM) are consistent with results of empirical studies and lessons learned on the knowledge highways and byways. In spite of these macro-level contentions and success stories, there has been little investigation of a systematic means for studying connections between KM activity and competitiveness. This paper advances a knowledge chain model that identifies and characterizes KM activities an organization can focus on to achieve competitiveness. The model is analogous to Porter's value chain and is grounded in a descriptive KM framework developed via a Delphi study involving international KM experts. It is comprised of five primary activities that an organization's knowledge processors perform in manipulating knowledge resources, plus four secondary activities that support and guide their performance. Each activity is discussed in detail, including examples. Evidence is provided from the literature illustrating each activity's role in adding value to an organization to increase its competitiveness through improved productivity, agility, reputation, and innovation. In conclusion, we present some observations about avenues for future research to extend, test, and apply the model in business practices.

In recent years the issue of supply chain risk has been pushed to the fore, initially by fears related to possible disruptions from the much publicised 'millennium bug'. Y2K passed seemingly without incident, though the widespread disruptions caused by fuel protests and then Foot and Mouth Disease in the UK, and by terrorist attacks on the USA have underlined the vulnerability of modern supply chains. Despite increasing awareness among practitioners, the concepts of supply chain vulnerability and its managerial counterpart supply chain risk management are still in their infancy. This paper seeks to identify an agenda for future research and to that end the authors go on to clarify the concept of supply chain risk management and to provide a working definition. The existing literature on supply chain vulnerability and risk management is reviewed and compared with findings from exploratory interviews undertaken to discover practitioners' perceptions of supply chain risk and current supply chain risk management strategies.

A generalization of the results of Rasetti and Zanardi concerning avoiding errors in quantum computers by using states preserved by evolution is presented. The concept of the dynamical symmetry is generalized from the level of classical Lie algebras and groups, to the level of a dynamical symmetry based on quantum Lie algebras and quantum groups (in the sense of Woronowicz). An intrinsic dependence of the concept of dynamical symmetry on the differential calculus (which holds also in the classical case) is stressed. A natural connection between quantum states invariant under a quantum group action, and quantum states preserved by the dynamical evolution is discussed.

In this article, mixture representations of survival functions of residual lifetimes of k-out-of-n systems are obtained when the components are independent but not necessarily identically distributed. Then we stochastically compare the residual lifetimes of k-out-of-n systems in one- and two-sample problems. In particular, the results extend some results in Li and Zhao [14], Khaledi and Shaked [13], Sadegh [17], Gurler and Bairamov [7] and Navarro, Balakrishnan, and Samaniego [16]. Applications in the proportional hazard rates model are presented as well.
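
For concreteness, the survival function of a k-out-of-n system with independent but non-identical components can be evaluated directly from the component survival probabilities (a generic computation, not the mixture representation developed in the article):

```python
import numpy as np

def k_out_of_n_survival(k, component_survival):
    """P(at least k of the n independent components are still working),
    given each component's survival probability at the time of interest."""
    n = len(component_survival)
    dp = np.zeros(n + 1)          # dp[j] = P(exactly j components working so far)
    dp[0] = 1.0
    for p in component_survival:
        dp[1:] = dp[1:] * (1 - p) + dp[:-1] * p
        dp[0] *= (1 - p)
    return dp[k:].sum()

# Example: a 2-out-of-4 system with heterogeneous components at some fixed time t.
print(k_out_of_n_survival(2, [0.9, 0.8, 0.95, 0.7]))
```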

We consider some nonlinear phenomena in metamaterials with negative refractive index properties. Our consideration includes a survey of previously known results as well as identification of the phenomena that are important for applications of this new field. We focus on optical behavior of thin films as well as multi-wave interactions.

Liquid crystal elastomers realize a fascinating new form of soft matter that is a composite of a conventional crosslinked polymer gel (rubber) and a liquid crystal. These solid liquid crystal amalgams, quite similarly to their (conventional, fluid) liquid crystal counterparts, can spontaneously partially break translational and/or orientational symmetries, accompanied by novel soft Goldstone modes. As a consequence, these materials can exhibit unconventional elasticity characterized by symmetry-enforced vanishing of some elastic moduli. Thus, a proper description of such solids requires an essential modification of the classical elasticity theory. In this work, we develop a rotationally invariant, nonlinear theory of elasticity for the nematic phase of ideal liquid crystal elastomers. We show that it is characterized by soft modes, corresponding to a combination of long wavelength shear deformations of the solid network and rotations of the nematic director field. We study thermal fluctuations of these soft modes in the presence of network heterogeneities and show that they lead to a large variety of anomalous elastic properties, such as singular length-scale dependent shear elastic moduli, a divergent elastic constant for splay distortion of the nematic director, long-scale incompressibility, universal Poisson ratios and a nonlinear stress-strain relation for arbitrary small strains. These long-scale elastic properties are universal, controlled by a nontrivial zero-temperature fixed point and constitute a qualitative breakdown of the classical elasticity theory in nematic elastomers. Thus, nematic elastomers realize a stable "critical phase", characterized by universal power-law correlations, akin to a critical point of a continuous phase transition, but extending over an entire phase.

This article reports on the development and validation of the Italian SF-36 Health Survey using data from seven studies in which an Italian version of the SF-36 was administered to more than 7000 subjects between 1991 and 1995. Empirical findings from a wide array of studies and diseases indicate that the performance of the questionnaire improved as the Italian translation was revised and that it met the standards suggested by the literature in terms of feasibility, psychometric tests, and interpretability. This generally satisfactory picture strengthens the idea that the Italian SF-36 is as valid and reliable as the original instrument and applicable and valid across age, gender, and disease. Empirical evidence from a cross-sectional survey carried out to norm the final version in a representative sample of 2031 individuals confirms the questionnaire's characteristics in terms of hypothesized constructs and psychometric behavior and gives a better picture of its external validity (i.e., robustness and generalizability) when administered in settings that are very close to the real world.

There are different applications of computer vision and digital image processing in various applied domains and automated production processes. In the textile industry, fabric defect detection is considered a challenging task, as the quality and the price of any textile product depend on the efficiency and effectiveness of automatic defect detection. Previously, manual human inspection was used in the textile industry to detect defects in the fabric production process. Lack of concentration, human fatigue, and time consumption are the main drawbacks associated with the manual fabric defect detection process. Applications based on computer vision and digital image processing can address the abovementioned limitations and drawbacks. Over the last two decades, various computer vision-based applications have been proposed in research articles to address these limitations. In this review article, we aim to present a detailed study about various computer vision-based approaches ...
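
A very simple sketch of one family of such approaches, statistical texture analysis: flag regions whose local variance deviates strongly from the fabric's typical texture statistics. This is our own illustrative baseline; the window size and threshold are assumptions, and the systems surveyed in such reviews are far more elaborate.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def defect_mask(gray, window=15, k=4.0):
    """Flag pixels whose local variance is an outlier w.r.t. the whole image."""
    gray = gray.astype(float)
    local_mean = uniform_filter(gray, size=window)
    local_sq_mean = uniform_filter(gray ** 2, size=window)
    local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)
    med = np.median(local_var)
    mad = np.median(np.abs(local_var - med)) + 1e-9     # robust spread estimate
    return np.abs(local_var - med) > k * mad            # boolean defect mask

# Synthetic check: uniform texture with a small anomalous patch.
rng = np.random.default_rng(1)
img = rng.normal(128, 5, size=(200, 200))
img[90:110, 90:110] += rng.normal(0, 40, size=(20, 20))  # simulated defect
print(defect_mask(img).sum(), "pixels flagged")
```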

Passenger airline delays have received increasing attention over the past several years as airspace congestion, severe weather, mechanical problems, and other sources cause substantial disruptions to a planned flight schedule. Adding to this challenge is the fact that each flight delay can propagate to disrupt subsequent downstream flights that await the delayed flights' aircraft and crew. This potential for delays to propagate is exacerbated by a fundamental conflict: slack in the planned schedule is often viewed as undesirable, as it implies missed opportunities to utilize costly perishable resources, whereas slack is critical in operations as a means for absorbing disruption. In this paper, we show how delay propagation can be reduced by redistributing existing slack in the planning process, making minor modifications to the flight schedule while leaving the original fleeting and crew scheduling decisions unchanged. We present computational results based on data from a major U.S. carrier, showing that significant improvements in operational performance can be achieved without increasing planned costs.
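
The propagation mechanism described above is often modeled with a simple recursion: a downstream flight absorbs an inbound delay only up to the slack built into the connection. A minimal sketch follows; the slack values and delays are illustrative, not the carrier data used in the paper.

```python
def propagate(initial_delay, slacks):
    """Propagate a root delay through a chain of connections.
    slacks[i] = planned buffer (minutes) before downstream flight i departs."""
    delays = []
    delay = initial_delay
    for slack in slacks:
        delay = max(0.0, delay - slack)   # slack absorbs part of the delay
        delays.append(delay)
    return delays

# A 60-minute root delay hitting a chain of flights with different buffers:
print(propagate(60, [15, 10, 30, 20]))   # [45, 35, 5, 0] minutes propagated
```

Redistributing the same total slack toward the connections that are most likely to be disrupted is exactly the kind of retiming lever the paper exploits.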

The tunable optical diffractive properties of zenithal bistable nematic liquid-crystal gratings are theoretically investigated. The liquid-crystal orientation is rigorously solved via a tensorial formulation of the Landau-de Gennes theory and the optical transmission properties of the gratings are investigated via full-wave finite-element frequency-domain simulations. It is demonstrated that by proper design the two stable states of the grating can provide nondiffracting and diffracting operation, the latter with equal power splitting among different diffraction orders. An electro-optic switching mechanism, based on dual-frequency nematic materials, and its temporal dynamics are further discussed. Such gratings provide a solution towards tunable beam-steering and beam-splitting components with extremely low power consumption.

Multi-criteria decision making (MCDM) is considered a complex decision-making tool involving both quantitative and qualitative factors. In recent years, several fuzzy MCDM (FMCDM) tools have been suggested for choosing the most suitable options. The purpose of this paper is to systematically review the applications and methodologies of fuzzy multi-criteria decision-making (FMCDM) techniques. This study reviewed a total of 403 papers published from 1994 to 2014 in more than 150 peer-reviewed journals (extracted from online databases such as ScienceDirect, Springer, Emerald, Wiley, ProQuest, and Taylor & Francis). According to experts' opinions, these papers were grouped into four main fields: engineering, management and business, science, and technology. Furthermore, these papers were categorized based on authors, publication date, country of origin, methods, tools, and type of research (FMCDM-utilizing research, FMCDM-developing research, and FMCDM-proposing research). The results of this study indicated that scholars published more papers in 2013 than in any other year. In addition, hybrid fuzzy MCDM among the integrated methods and fuzzy AHP among the individual methods were ranked as the first and second most-used methods. Additionally, Taiwan was ranked as the first country contributing to this survey, and engineering was ranked as the first field applying fuzzy DM tools and techniques.

For most of the last century, condensed matter physics has been dominated by band theory and Landau's symmetry breaking theory. In the last twenty years, however, there has been an emergence of a new paradigm associated with fractionalization, emergent gauge bosons and fermions, topological order, string-net condensation, and long range entanglements. These new physical concepts are so fundamental that they may even influence our understanding of the origin of light and electrons in the universe.

This paper presents a comparison study for the optimization of stencil printing operations using hybrid intelligence techniques and response surface methodology (RSM). On average, 60% of soldering defects are attributed to the solder paste stencil printing process in surface mount assembly (SMA). Manufacturing costs decrease with increasing first-pass yield in the stencil printing process. This study compares two hybrid intelligence approaches with RSM as methods of solving the stencil printing optimization problem, which involves multiple performance characteristics. The optimization process is threefold: a data set obtained from an experimental design, followed by data preprocessing, provides an accurate data source for the RSM study and for training neural networks to formulate the nonlinear model of the stencil printing process, with or without combining multiple performance characteristics into a single desirability value; a genetic algorithm then searches the trained neural networks to obtain the optimal parameter sets. The empirical defect-per-million-opportunities (DPMO) measurements demonstrate that the two hybrid intelligence methods can provide satisfactory performance for the stencil printing optimization problem.

A recursive least squares (RLS) algorithm for estimation of vehicle sideslip angle and road friction coefficient is proposed. The algorithm uses information from sensors onboard the vehicle and control inputs from the control logic, and is intended to provide the essential information for active safety systems such as active steering, direct yaw moment control, or their combination. Based on a simple two-degree-of-freedom (DOF) vehicle model, the algorithm minimizes the squared errors between the estimated lateral acceleration and yaw acceleration of the vehicle and their measured values. The algorithm also utilizes available control inputs such as active steering angle and wheel brake torques. The proposed algorithm is evaluated using an 8-DOF full vehicle simulation model including all essential nonlinearities and an integrated active front steering and direct yaw moment control on dry and slippery roads.
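
For reference, the generic exponentially weighted RLS update that such estimators build on (a textbook formulation; the regressor construction from the two-DOF vehicle model is not reproduced here):

```python
import numpy as np

class RLS:
    """Textbook recursive least squares with forgetting factor lam."""
    def __init__(self, n_params, lam=0.98, p0=1e3):
        self.theta = np.zeros(n_params)       # parameter estimate
        self.P = p0 * np.eye(n_params)        # inverse-correlation matrix
        self.lam = lam

    def update(self, phi, y):
        """phi: regressor vector, y: measured output for this sample."""
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)            # gain vector
        self.theta += k * (y - phi @ self.theta)      # correct the estimate
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# Toy usage: identify y = 2*u1 - 0.5*u2 from noisy samples.
rng = np.random.default_rng(0)
est = RLS(n_params=2)
for _ in range(200):
    phi = rng.normal(size=2)
    y = 2.0 * phi[0] - 0.5 * phi[1] + 0.01 * rng.normal()
    theta = est.update(phi, y)
print(theta)   # approximately [2.0, -0.5]
```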

This paper presents two hybrid genetic algorithms (HGAs) to optimize the component placement operation for collect-and-place machines in printed circuit board (PCB) assembly. The component placement problem is to simultaneously optimize (i) the assignment of components to a movable revolver head or assembly tour, (ii) the sequence of component placements on a stationary PCB in each tour, and (iii) the arrangement of component types to stationary feeders. The objective of the problem is to minimize the total traveling time spent by the revolver head for assembling all components on the PCB. The major difference between the HGAs is that the initial solutions are generated randomly in HGA1, whereas the Clarke and Wright saving method, the nearest neighbor heuristic, and the neighborhood frequency heuristic are incorporated into HGA2 for the initialization procedure. A computational study is carried out to compare the algorithms with different population sizes. The results show that the performance of HGA2 is superior to that of HGA1 in terms of the total assembly time.
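
One of the seeding heuristics named above, nearest-neighbor sequencing of the placement points, can be sketched as follows (a generic version for illustration; the paper's actual initialization also handles head assignment and feeder arrangement):

```python
import numpy as np

def nearest_neighbor_sequence(points, start=0):
    """Greedy placement sequence: always visit the closest unplaced point next."""
    points = np.asarray(points, dtype=float)
    unvisited = set(range(len(points)))
    order = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[order[-1]]
        nxt = min(unvisited, key=lambda i: np.hypot(*(points[i] - last)))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Placement coordinates (mm) on a board -- illustrative values only.
coords = [(0, 0), (10, 2), (3, 8), (12, 9), (1, 15)]
print(nearest_neighbor_sequence(coords))   # greedy visiting order, e.g. [0, 2, 4, 3, 1]
```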

The quaternion Dirac equation has been analyzed and its supersymmetrization has been discussed consistently. It has been shown that the quaternion Dirac equation automatically describes the spin structure, with the spin-up and spin-down components of the two-component quaternion Dirac spinors associated with positive and negative energies. It has also been shown that the supersymmetrization of the quaternion Dirac equation works well for the different cases associated with zero mass, nonzero mass, a scalar potential, and generalized electromagnetic potentials. Accordingly, we have discussed the splitting of the supersymmetrized Dirac equation in terms of electric and magnetic fields.

This paper studies the combined task of determining a favorable machine configuration and line balancing (MCLB) for an assembly line where a single type of printed circuit board is assembled by a set of interconnected, reconfigurable machine modules. The MCLB problem has been solved previously by heuristic methods. In the present work, we give a mathematical formulation for it and transform the model into a linear integer programming model that can be solved using a standard solver for problems of moderate size. The model determines the best machine configuration and allocation of components to the machine modules with the objective of minimizing the cycle time. Because the solutions found in this way are globally optimal, they can be used to evaluate the efficiency of previous heuristics designed for the MCLB problem. In our experiments, an evolutionary algorithm gave near optimal results.

Line balancing of a printed circuit board (PCB) assembly line is considered in the present paper. The production line consists of a number of machines for inserting electronic components on bare PCBs. The aim is to distribute the assembly operations of a single PCB type to the different machines in such a way that the throughput (i.e., the number of finished PCBs per time unit) of the line is maximized. We suppose that the total time for placements is a linear function of the number of component insertions performed by a machine. Effective mathematical formulations of the balancing problem are then available, but previous models omit several aspects having an effect on the actual placement times. In particular, we extend an existing MILP formulation of the problem to consider the usage of feeder modules, precedence constraints among the placement operations, and duplication of frequently used components in several machines. We consider production lines consisting of several gantry-type placement machines. Unlike previous research, we applied standard optimization tools for solving the balancing problems. We observed that the CPLEX software was able to solve MILP formulations of 2- and 3-machine problems with up to 150 different component types and a relatively large number of component placements (from 400 to 6,000). On the other hand, the running time was rather unstable, so heuristics are still needed for cases where exact methods fail.
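
A stripped-down version of the basic balancing MILP implied by the linear placement-time assumption (assign component types to machines, minimize the cycle time). This sketch omits the feeder, precedence, and duplication extensions the paper adds, uses made-up data, and uses the open-source PuLP/CBC solver rather than CPLEX.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

# Illustrative data (assumed): insertions per component type and a per-machine
# linear time model  time_m = setup_m + rate_m * (insertions assigned to m).
insertions = {"C1": 120, "C2": 300, "C3": 80, "C4": 200}
machines = {"M1": {"setup": 5.0, "rate": 0.10},
            "M2": {"setup": 7.0, "rate": 0.08}}

prob = LpProblem("pcb_line_balancing", LpMinimize)
assign = LpVariable.dicts("assign", (insertions, machines), cat=LpBinary)
cycle = LpVariable("cycle_time", lowBound=0)

prob += cycle                                        # minimize the cycle time
for c in insertions:                                 # each type on exactly one machine
    prob += lpSum(assign[c][m] for m in machines) == 1
for m, d in machines.items():                        # each machine's time bounds the cycle
    prob += cycle >= d["setup"] + d["rate"] * lpSum(
        insertions[c] * assign[c][m] for c in insertions)

prob.solve()
print(value(cycle), {c: [m for m in machines if assign[c][m].value() > 0.5]
                     for c in insertions})
```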

In recent years several expert systems have been developed for practical applications in applied statistical methodologies. Existing expert systems in statistics have explored several areas, e.g., the determination of appropriate statistical tests, regression analysis, and determination of the "best" experimental design for industrial screening experiments. The DESIGN EXPERT, a prototype expert system for the design of complex statistical experiments, is presented here. It is intended for scientific investigators and statisticians who must design and analyze complex experiments, e.g., multi-level medical experiments with nested factors, repeated measures, and both fixed and random effects. This system is "expert" in the sense that it is able to (i)