Basic Research Research Papers - Academia.edu

The accumulated science base in public research centres (PRCs), current learning processes in firms and emergent knowledge flows among actors are creating knowledge spaces at the regional level in Mexico. These conditions represent an advantage upon ...

The present paper aims to present legal language from the relevance-theoretic perspective. The focus is on English legal texts. The basic research question behind the analysis is whether relevance theory, being a psychological theory of communication which puts emphasis on individual perception of communicative stimuli, provides an adequate framework for the description of language in restricted semantic domains, especially in the legal context. The paper addresses related issues and attempts to locate legal language analysis within the relevance-theoretic framework.

Enhancing reading comprehension through Strategy Intervention.

This article focuses on issues concerning science and technology relationships posed by the emergence of a new drug discovery method, namely, combinatorial chemistry and biology. We assess the scientific content of combinatorial chemistry and biology using citations in patents to scientific journals and compare this research platform with biotechnology. We also identify the institutional affiliation of all the authors of the cited papers, which leads us to an analysis of knowledge spillovers between the main participants in the research network. Finally, we examine the relevance of localisation in the process of knowledge exchange with regard to EU countries and the US. The result of the analysis provides evidence to support the view that the inventive capacity of a country is dependent upon the basic research which is carried out, especially in universities and public research centres located in the inventor's country.

This paper is an exploration of synthetic division in compact form. The main goal was to develop an alternative algorithm for the division of polynomials whose dividend is $P(x) = a_1x^n + a_2x^{n-1} + a_3x^{n-2} + \cdots + a_nx + a_{n+1}$ and whose divisor is $D(x) = b_1x^m + b_2x^{m-1} + b_3x^{m-2} + \cdots + b_mx + b_{m+1}$, where $n > m$, $a_1 \neq 0$, $b_1 \neq 0$, and the $a_i$'s and $b_i$'s, including $a_{n+1}$ and $b_{m+1}$, are constants. The quotient is of the form $Q(x) = q_1x^{n-m} + q_2x^{n-m-1} + q_3x^{n-m-2} + \cdots + q_{n-m}x + q_{n-m+1}$, while the remainder is $R(x) = r_1x^{m-1} + r_2x^{m-2} + \cdots + r_{m-1}x + r_m$. Specifically, this study aimed to develop an algorithm using synthetic division in compact arrangement when $b_1 = 1$ and when $b_1 \neq 1$, to provide additional conditions for the problems where the usual synthetic divisions are inappropriate, and to generalize the developed algorithm for dividing polynomials of higher degrees. Basic research was employed in this study. This resulted in the development of an alternative algorithm, initially for quadratic, cubic and quartic divisors. Also, the conditions under which the usual synthetic division is inappropriate were identified, to wit: (a) the divisor $D(x)$ is prime or irreducible; (b) in case $D(x)$ is reducible, at least one of the factors of $D(x)$ is an irreducible non-linear polynomial; and (c) in case $D(x)$ is reducible and its factors are all linear polynomials (either monic or non-monic), one of them is not a factor of $P(x)$. As a generalized result, the quotient is formulated as $Q(x) = q_1x^{n-m} + q_2x^{n-m-1} + q_3x^{n-m-2} + \cdots + q_{n-m}x + q_{n-m+1}$, where $q_n = a_n - b_2q_{n-1} - \cdots - b_{m+1}q_{n-m}$ if $b_1 = 1$, or $q_n = (a_n - b_2q_{n-1} - \cdots - b_{m+1}q_{n-m})/b_1$ if $b_1 \neq 1$, while the remainder is $R(x) = r_1x^{m-1} + \cdots + r_{m-2}x^2 + r_{m-1}x + r_m$, where $r_1 = q_{n-m+2}$, $r_2 = q_{n-m+3}$, $r_3 = q_{n-m+4}$, ..., $r_m = q_{n+1}$. Hence, the results comprise algorithms, conditions and a generalized formula that can be used as an alternative method for dividing polynomials with non-linear polynomial divisors.
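
For readers who want to experiment with the generalized recurrence above, here is a minimal Python sketch of it. It folds the subtraction steps of the compact scheme into a working row of coefficients and handles both the $b_1 = 1$ and $b_1 \neq 1$ cases; it illustrates the recurrence, not the authors' tabular arrangement.

```python
# Minimal sketch of the division recurrence described in the abstract.
# Coefficients are listed from the leading term down (a_1 .. a_{n+1}),
# matching the notation above. Works for monic and non-monic divisors.

def synthetic_divide(a, b):
    """Divide P (coeffs a, degree n) by D (coeffs b, degree m), n >= m.
    Returns (quotient coefficients, remainder coefficients)."""
    n, m = len(a) - 1, len(b) - 1
    q = []
    r = list(a)                    # working row; its tail becomes R(x)
    for k in range(n - m + 1):
        qk = r[k] / b[0]           # division by b_1 needed when b_1 != 1
        q.append(qk)
        for j in range(1, m + 1):
            r[k + j] -= qk * b[j]  # subtract q_k * (b_2 .. b_{m+1}), shifted
    return q, r[n - m + 1:]

# Example: (x^3 - 2x^2 + 3x - 4) / (x^2 + x + 1)
quotient, remainder = synthetic_divide([1, -2, 3, -4], [1, 1, 1])
print(quotient)   # [1.0, -3.0]  -> x - 3
print(remainder)  # [5.0, -1.0]  -> 5x - 1
```

This mirrors the generalized formula: each $q_k$ is the current leading entry (already reduced by earlier $q$'s) divided by $b_1$, and the final $m$ entries of the working row are the remainder coefficients.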

Since the discovery that the therapeutic efficacy of antipsychotic drugs was significantly correlated with their ability to block dopamine D2 receptors, abnormal dopamine transmission in the forebrain has been postulated to underlie psychosis in schizophrenia. In the past 15 years, an impressive amount of clinical and basic research aimed at the study of schizophrenia has indicated that prefrontal and temporal cortical abnormalities may be more important in the etiology of many of the symptoms of schizophrenia, including psychosis. However, the cortical systems that appear to have structural and/or metabolic abnormalities in schizophrenia patients potently regulate forebrain dopamine transmission through a number of mechanisms. In turn, dopamine modulates excitatory transmission mediated by frontal and temporal cortical projections to the basal ganglia and other regions. The present review summarizes the multiple interactions between forebrain dopamine systems and frontal and temporal corticostriatal transmission. It then examines the role of these interactions in normal behaviors and the psychopathology of schizophrenia.

The vast amount of basic research on platinum coordination complexes has produced, over the past 25 years, several thousand new molecules for preclinical screening and 28 compounds which have entered clinical development. The goals of these research activities have been to identify compounds with superior efficacy, reduced toxicity, lack of cross-resistance or improved pharmacological characteristics as compared with the parent compound, cisplatin. After the remarkable therapeutic effects of cisplatin had been established, only a few other platinum compounds succeeded in reaching general availability. Whereas carboplatin is an analogue with an improved therapeutic index (mostly driven by reduced organ toxicity) over that of cisplatin, new compounds clearly more active than or non-cross-resistant with cisplatin have not yet been identified. Work on the platinum analogues that remain under investigation is focused on expanding the utilisation of platinum therapy to tumour types not usually treated with, or responsive to, cisplatin or carboplatin. In addition, novel routes of administration constitute another avenue of research. The clinical development of platinum coordination complexes, with emphasis on those compounds still under active development, is reviewed.

While technological developments have enabled global distribution of software development teams, and wage differentials have stimulated offshore software services export industries such as India’s, the overriding factor in the growth of globalization in this industry is the search for talented software people. The best software talent is in increasing demand as the software industry itself grows, and as software systems become a competitive tool in many other industries. Unfortunately, global supplies of the best talent are already tapped and there are no significant new sources readily available in the near future. The implications for US trade and public policy are far-reaching.

Background: Risk of cardiovascular disease is assessed, in part, by laboratory measurement of the concentrations of several lipoproteins. β-Quantification is a method of lipoprotein measurement that uses ultracentrifugation to partially separate lipoprotein classes. Although β-quantification is used largely in clinical and basic research, methods have not been described to allow the analysis of a large number of...

Prostate cancer chemoprevention represents a relatively new and promising strategy for reducing the immense public health burden of this devastating cancer of men in the United States and Western societies. Chemoprevention is defined as the administration of agents (drugs, biologics, and natural products) that modulate (inhibit) one or more steps in the multistage carcinogenesis process culminating in invasive adenocarcinoma of the prostate. In 2000, there were an estimated 170,000 new cases of prostate cancer and 31,000 deaths in the United States. During the past decade, the National Cancer Institute (NCI) organized the chemoprevention research program and began testing the first generation of promising agents (eg, 4-(hydroxy)-fenretinide [4-HPR], difluoromethylornithine [DFMO], antiandrogens) in high-risk cohorts and launched the first large-scale US phase 3 primary prevention trial, known as the Prostate Cancer Prevention Trial (PCPT-1), in 18,000 average-risk men (age more than 55 years and prostate-specific antigen [PSA] less than 3 ng/mL) treated for 7 years with finasteride or placebo. In the summer of 1998, the NCI Prostate Cancer Progress Review Group (PRG) Report to the director of NCI was published in response to the leadership of the prostate cancer advocacy community in conjunction with Congress. To further elucidate and address critical issues identified in this report and to develop a research agenda for the newly created Prostate and Urologic Cancer Research Group in the Division of Cancer Prevention at NCI, the NCI organized the workshop "New Clinical Trial Strategies for Prostate Cancer Chemoprevention." The major objectives were to promote understanding and cooperation among the NCI, US Food and Drug Administration (FDA), academia, pharmaceutical industry, and the public regarding new opportunities for clinical prevention trials for prostate cancer. The workshop was divided into three concurrent breakout panels and a fourth joint integrative panel. The workshop addressed multiple key areas identified in the PRG report in the following panels: (1) Molecular Targets and Promising Agents in Clinical Development; (2) Intermediate Endpoint Biomarkers for Prevention Trials; (3) High-Risk Study Populations for Prevention Trials; and (4) Preventive Clinical Trial Designs and Regulatory Issues. Expert panelists were drawn from leading academic, pharmaceutical, and government scientists in basic research and clinical investigation. Key pharmaceutical, biotechnology, academic, and National Institutes of Health scientists presented overviews of their new agents and products in clinical development (representing the next generation of promising agents). Senior FDA physicians from the Center for Drugs and Center for Biologics presented on current standards for new drug and biologic approval for chemoprevention efficacy. Some of the key topics included recent advances in the state of knowledge of promising agents in the clinic based on molecular targets, as well as bottlenecks in drug development for pharmaceutical sponsors; strategic modulable biomarkers that can serve as primary endpoints in phase 1/2 trials to assess preventive efficacy; high-risk cohorts with precancer (high-grade prostatic intraepithelial neoplasia); and representative clinical trial designs that are ready for immediate translation into efficient prevention trials, such as Bayesian sequential monitoring for early assessment of biologic activity and factorial designs for assessment of multiagent combinations.
Finally, each expert panel generated recommendations for areas of future research emphasizing opportunities and infrastructure needs.

The adoption of a common formalism for describing planning domains fosters far greater reuse of research and allows more direct comparison of systems and approaches, and therefore supports faster progress in the field. A common formalism is a compromise between expressive power (in which development is strongly driven by potential applications) and the progress of basic research (which encourages development from well-understood foundations). The role of a common formalism as a communication medium for exchange demands that it be provided with a clear semantics. This paper describes extensions of PDDL2.1 (used in the 3rd International Planning Competition) that support the modelling of continuous time-dependent effects, and illustrates why they can play a critical role in modelling real domains.
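
As a concrete illustration of what continuous time-dependent effects add over instantaneous start/end effects, here is a small Python sketch of the semantics. This is not PDDL syntax, and the fuel/burn-rate names are invented for the example:

```python
# Toy model of a durative action with the continuous effect
# d(fuel)/dt = -burn_rate, i.e. fuel(t) = fuel0 - burn_rate * t
# while the action executes.

def fuel_during(t, fuel0, burn_rate):
    return fuel0 - burn_rate * t

fuel0, burn_rate, duration = 10.0, 3.0, 4.0

# Discrete end-effect model: the fluent is only updated when the action
# ends, so a problem is visible only at t = duration.
end_fuel = fuel_during(duration, fuel0, burn_rate)   # -2.0: invalid state

# Continuous model: we can solve for *when* fuel reaches zero and see the
# invariant is violated partway through the action, not just at its end.
t_empty = fuel0 / burn_rate                          # ~3.33 < duration
print(end_fuel, t_empty)
```

A planner that models only the end effect cannot distinguish "ran out exactly at the end" from "ran out mid-action"; with the continuous effect the violation time is available, which is what makes such effects important for real domains.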

Purpose. Motor skills can be learned in an explicit or an implicit manner. Explicit learning places high demands on working memory capacity, but engagement of working memory is largely circumvented when skills are learned implicitly. We propose that individuals with impaired movement dynamics may benefit from implicit learning methods when acquiring sports-related motor skills. Method. We discuss converging evidence that individuals with cerebral palsy and children born prematurely have compromised working memory capacity. This may in part explain the difficulties they encounter when learning motor and other skills. We also review tentative evidence that older people, whose movement dynamics deteriorate, can implicitly learn sports-related motor skills and that this results in more durable performance gains than explicit learning. Results. Individuals with altered movement dynamics and compromised working memory can benefit from implicit motor learning. Conclusion. We conclude with an appeal for more extensive evaluation of the merits of implicit motor learning in individuals with impaired movement dynamics.

Building automation (BA) and smart homes (SHs) have traditionally not been a unified field but have varied in their origins, legal foundations, applications, goals, and national funding programs for basic research. Only within the last few years has an international common focus appeared. The following overview gives not only an introduction to the topic of BA but also its distinction from other areas of automation in which networks at the field level (the sensor and actuator level) play an important role. Finally, the scientific challenges will be mentioned. SHs are referred to when the differences from BA have to be explicitly stressed. This paper is an introduction for the special IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS section on BA and shall introduce the reader to this new topic. BA not only has a huge economic potential but also is of significant academic interest today.

Many studies have explored the effects of immunotherapy, alone or in combination with conventional therapies, on both experimental and human cancers. Evidence has been provided that combined treatments with thymosin alpha 1 (Tα1) and low doses of interferon (IFN) or interleukin (IL)-2 are highly effective in restoring several immune responses depressed by tumor growth and/or cytostatic drugs. In addition, when combined with specific chemotherapy, they are able to increase the anti-tumor effect of chemotherapy while markedly reducing the general toxicity of the treatment. The advantages of using this combined chemo-immunotherapeutic approach in experimental and human cancers are reviewed in this issue.

Surface flashover in compressed-gas insulated systems is a much studied, but poorly explained phenomenon. In this paper we review the literature of surface flashover with primary emphasis on the understanding of physical processes leading to discharge initiation and insulator flashover under high voltage excitation. The flashover models presently in vogue will first be discussed, followed by the results of some recent experiments which are likely to have an impact on further modeling. Included in this context are phenomena such as ionization, surface charging, partial discharges, optical activity, and gas/dielectric interactions. Finally, the influence of system parameters such as insulator size, shape, surface condition, triple junction geometry, voltage waveform, gas formulation and particle contamination are discussed with regard to their effect on the flashover characteristics. Mechanisms are suggested in an effort to provide a physical explanation for the observed phenomena. Although the physics of the discharge initiation and propagation processes are presently not well understood, and the present models only account for a few of the mechanisms known to be important in the discharge development, all the work points to an interaction between the spacer and the various electron/photon processes in the surrounding gas volume. This interaction has not been accounted for in the discharge models proposed to date. Further modeling work should incorporate these interactions and the intrinsic properties of the dielectric which are related to these interactions. More basic research is suggested to provide a better understanding of the physics of the discharge initiation and breakdown phenomena.

Few guidelines existed for designing a better spacer. Clearly this indicates a strong need to investigate the fundamental processes relevant to the flashover mechanism. A more recent review by Laghari and Quereshi [5] was also of an applied nature, perhaps to an even greater degree. This review detailed the flashover characteristics of spacers used in compressed-gas insulation transmission (CGIT) systems, with particular emphasis on systems containing conducting particulate contamination. The review covered most of the items of interest to CGIT systems, e.g. gas insulation, spacer materials, spacer geometries, arc resistance, partial discharge behavior, waveform of voltage application, shielded electrodes, inserts, and contamination due to conducting particles. In contrast to these earlier reviews, the review presented here is of a more fundamental nature. In this review there will be a concentration on the physical processes which lead to and culminate in breakdown of insulator-bridged gas insulated systems. As there has not been a great amount of work of a fundamental nature reported in the literature, it will be necessary to rely, to some extent, on practical observations in order to extract data with regard to the physical processes. For this reason, papers included in this review are as likely to deal with practical observations as they are to deal with fundamental measurements. Topics such as flashover modeling, measurements of surface charge densities, partial discharge characteristics, and optical activity will be discussed along with the more practical observations of insulator shape effects, surface condition, triple junction geometry, etc.

Bioenergetic models of fish have been used to study a large number of processes. Like most models, bioenergetic models require the estimation of numerous parameters. As a consequence, they have often relied on parameters borrowed from other species or values extrapolated from other life stages or size-classes. The magnitude of the biases associated with these practices remains largely unknown. The objective of this study was to determine whether or not metabolic rates could be extrapolated between closely related species and life stages. We focused on Pacific salmon Oncorhynchus spp. and steelhead O. mykiss, as the metabolic rates of these species have been well documented. Our analyses showed that models derived from closely related species did not accurately predict the metabolic rates of salmon, indicating that the practice of "species borrowing" should be avoided in assessing fish metabolic rates. Our work also showed that allometric equations of metabolic rates were not stable when measured over small size ranges and that biases could be introduced when these models are extrapolated to smaller or larger fish. In addition, we found that, except for steelhead, published models of Pacific salmon metabolic rates were inaccurate. Mean bias ranged from close to 0% to over 200% depending on size and water temperature, with about two-thirds of the bias values exceeding 20%. These biases could have large impacts on bioenergetic model predictions. Increased funding and scientific recognition of the need for additional basic research will be necessary to build more accurate bioenergetic models for Pacific salmon, especially models applied to the ocean phase of the life cycle.
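
The extrapolation hazard the study describes is easy to reproduce numerically. The following Python sketch uses invented numbers (not salmon data): an allometric model R = aW^b is fitted by log-log regression over a narrow size range and then applied far outside it, under the assumption that the true scaling exponent shifts with life stage.

```python
# Toy demonstration of allometric extrapolation bias; all numbers invented.
import numpy as np

rng = np.random.default_rng(0)

def true_rate(w):
    # Hypothetical ontogenetic shift: steeper scaling below 30 g than above.
    w = np.asarray(w, dtype=float)
    small = 0.4 * w**0.97
    large = 0.4 * 30**0.97 * (w / 30) ** 0.80
    return np.where(w <= 30, small, large)

w_fit = np.linspace(5, 20, 40)                  # narrow size range (g)
r_obs = true_rate(w_fit) * rng.lognormal(0.0, 0.05, w_fit.size)

# log R = log a + b log W, fitted only on the small fish.
b, log_a = np.polyfit(np.log(w_fit), np.log(r_obs), 1)
predict = lambda w: np.exp(log_a) * w**b

w_new = 500.0                                   # far outside the fitted range
bias = float(100 * (predict(w_new) - true_rate(w_new)) / true_rate(w_new))
print(f"fitted exponent b = {b:.2f}, bias at {w_new:.0f} g = {bias:+.0f}%")
# -> a bias on the order of +60% in this toy setup, comfortably inside the
#    0-200% range the study reports for borrowed/extrapolated models.
```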

Social competence is of great interest to developmental psychopathologists and is assessed frequently in both basic and applied research. A review of the literature reveals not only significant heterogeneity in definitions of this construct but an apparent disconnect between theoretical, measurement, and intervention models of competence in youth. This paper attempts to integrate these disparate enterprises by identifying four types of predictors theorists have associated with competence (child, behavior, situation, and judge) and critiquing common models in light of these dimensions. In general, assessment and intervention approaches appear less complex than theoretical conceptualizations of competence. When considering whether to incorporate additional predictors, notably situation- and judge-level factors, into these models, researchers must weigh parsimony versus the model misspecification that results from omitting important variables. Basic research that may assist in making these decisions is identified.

The proposed material is a nontrivial approach to determining the conditions and causes of cold fusion processes. The basis for these definitions was a logical analysis of the data obtained as a result of research into the functional structures of nuclear symbols. Periodic changes in the combination of fragments of functional projections of nucleons in the structure of the nucleus allowed us to establish two main processes that form the polar content of a nuclear charge. These are the opposite processes of compression and expansion of systemic fragmentary complexes of periodic projections, distributed by the functions of protons and neutrons. As a result of these data, a cyclic mathematical table of the algorithmic sequence of nuclear symbols was compiled, which determines the real numbers of protons and neutrons in various compositions of nucleons. Based on the proposed sequence, data are given that determine the causes and conditions for the emergence of the mechanism of cold fusion. These results relate to the functional structure of the hydrogen charge volume. The compression and expansion processes occurring in the functional structure of the unit determine the causes of the systems, the order of expression of functional fragments of nuclear symbols in the periods of the hydrogen circuit. The reason for these processes is the unstable polarization of the functional structures of the composition of volume fragments. This instability is determined by the general state of the external and internal content of the conditions for the formation of a functional specific volume contour. In determining the conditions for the formation of each volume contour, there...

The Linguistic Data Consortium at the University of Pennsylvania has recently been engaged in the creation of large-scale annotated corpora of broadcast news materials in support of the ongoing Topic Detection and Tracking (TDT) research project. The TDT corpora were designed to support three basic research tasks: segmentation, topic detection, and topic tracking in newswire, television and radio sources from English and Mandarin Chinese. The most recent TDT corpus, TDT3, added two tasks, story link and first story detection. Annotation of the TDT corpora involved a large staff of annotators who produced millions of human judgements. As with any large corpus creation effort, quality assurance and inter-annotator consistency were a major concern. This paper reports the quality control measures adopted by the LDC during the creation of the TDT corpora, presents techniques that were utilized to evaluate and improve the consistency of human annotators for all annotation tasks, and discusses...
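
As background for readers unfamiliar with annotation quality control, the sketch below computes Cohen's kappa, a standard inter-annotator agreement statistic, for two hypothetical annotators. It illustrates the kind of consistency measure used in corpus QC generally; it is not a description of the LDC's actual procedure, and the labels are invented.

```python
# Cohen's kappa: chance-corrected agreement between two annotators.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each annotator's marginal label rates.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in freq_a)
    return (observed - expected) / (1 - expected)

# Two annotators judging whether stories are on-topic for a tracking target.
ann1 = ["on", "off", "on", "on", "off", "on", "off", "on"]
ann2 = ["on", "off", "off", "on", "off", "on", "off", "off"]
print(f"kappa = {cohens_kappa(ann1, ann2):.2f}")   # 0.53: moderate agreement
```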

Clinical, epidemiological and basic research evidence clearly supports the inclusion of regular physical activity as a tool for the prevention of chronic disease and the enhancement of overall health. In children, activities of a moderate intensity may enhance overall health, and assist in preventing chronic disease in at-risk youth. The numerous health benefits of regular exercise are dependent on the type, intensity and volume of activity pursued by the individual. These benefits include reduction of low density lipoproteins while increasing high density lipoprotein; improvement of glucose metabolism in patients with type II diabetes; improved strength, self esteem and body image; and reduction in the occurrence of back injuries. In addition, a progressive, moderate-intensity exercise program will not adversely affect the immune system and may have a beneficial effect on the interleukin-2/natural killer cell system. Furthermore, by decreasing sedentary behaviors and, thus, increasing daily physical activity, individuals may experience many stress-reducing benefits, which may enhance the immune system. In conclusion, moderate-intensity exercise of a non-structured nature seems to facilitate most of the disease prevention goals and health promoting benefits. With new guidelines promoting a less intense and more time-efficient approach to regular physical activity, it is hoped that an upward trend in physical activity patterns, specifically among children at risk for chronic disease, will develop in the near future.

Macromolecular modeling and design are increasingly useful in basic research, biotechnology, and teaching. However, the absence of a user-friendly modeling framework that provides access to a wide range of modeling capabilities is hampering the wider adoption of computational methods by non-experts. RosettaScripts is an XML-like language for specifying modeling tasks in the Rosetta framework. RosettaScripts provides access to protocol-level functionalities, such as rigid-body docking and sequence redesign, and allows fast testing and deployment of complex protocols without need for modifying or recompiling the underlying C++ code. We illustrate these capabilities with RosettaScripts protocols for the stabilization of proteins, the generation of computationally constrained libraries for experimental selection of higher-affinity binding proteins, loop remodeling, small-molecule ligand docking, design of ligand-binding proteins, and specificity redesign in DNA-binding proteins.

A national cooperative project on rare paediatric tumours (the TREP project) was launched in 2000 in Italy, with a view to improving the clinical management and the basic research on these 'orphan' tumours, defined as those childhood solid malignancies characterised by an annual incidence < 2/million and not considered in other clinical trials. This paper describes the process that the group developed and the problems it had to face, and aims to stimulate a debate on the rationale, scientific relevance and feasibility of running scientific research programs on rare childhood neoplasms. In the first phase of its activity, the project developed diagnostic and therapeutic recommendations for each rare tumour and established a collaborative network between 'experts' dedicated to each histotype and other specialists (i.e. adult oncologists, surgeons) involved in the management of these tumours. From 2000 to 2005, 297 patients have been registered from 35 Italian centres. This experience demonstrates the feasibility of a national multidisciplinary cooperation on rare paediatric malignancies and suggests that international studies could be realised.

This paper reviews the literature concerning the use of surface electromyography (sEMG) for the study of the neck musculature in response to work and workplace design during light work and semi-static tasks. The paper also draws upon basic research and biomechanical modeling in order to provide methodological recommendations for the use of surface electromyography in this region of the body and to identify areas which require further investigation. The paper includes review and discussion of electrode site location, methods of normalization, data reliability, and factors that can affect sEMG signals from this region, including noise, physiologic artifact, stress, visual deficiencies, and pain. General guidance for maximum exertions with the neck musculature, for sEMG normalization or other purposes, is also included.

During the past 30 years, there have been huge breakthroughs in HIV research, the most important of which have been the development of antiretroviral drugs (ARVs). However, the ability of HIV to rapidly mutate and replicate has given rise to the development of drug resistance that threatens to reverse much of the progress that has been accomplished in HIV therapy. New research has shown that ARVs can even be used as preventive tools if administered to HIV vulnerable persons as tablets or vaginal microbicides in advance of possible HIV exposure. Furthermore, the successful treatment of HIV infected individuals can render HIV-positive individuals virtually non-infectious in many cases because the amount of virus in blood, tissues, and genital fluids is usually greatly reduced. This progress has led to a significant diminution in numbers of both infants who acquire HIV from their mothers as well as the sexual partners of infected, treated persons. The problem of drug resistance is like...

Acute stress is a fundamental adaptive response which enables an organism to cope with daily threatening environmental stimuli. If prolonged and uncontrollable, the stress response may become inadequate and ultimately result in health damage. Animal models of stress in rodents indicate that both acute and chronic stressors have pronounced effects on sleep architecture and circadian rhythms. One major physiological response elicited by stress is activation of the hypothalamo-pituitary-adrenal axis. In both animals and humans, the hypothalamo-pituitary-adrenal axis plays an important role in sleep-wake regulation and in alterations of the sleep-wake cycle secondary to exposure to acute or chronic stressors. In humans, dysfunction of the neuroendocrine regulation of sleep can lead to severe sleep disturbances. The progressive decay of the hypothalamo-pituitary-adrenal axis in elderly people, which mimics chronic exposure to stress, may contribute to fragmented and unstable sleep in ageing. Shift workers, chronic insomniacs or patients suffering from mental disorders show abnormal hypothalamo-pituitary-adrenal secretory activity and concomitant sleep disturbances. Those sleep disorders and possible underlying mechanisms are briefly reviewed.

Our understanding of nociceptive processing and of plastic changes after persistent noxious input has increased immensely within the last two decades. It is now clear that long-lasting noxious stimulation or damage to the nervous system give rise to a neuronal hyperexcitability and that this sensitisation of the nervous system plays an important role for development and maintenance of chronic pain. The manifestations of such hyperexcitability are numerous and include among others: increased neuronal response to a suprathreshold stimulus, expansion of the peripheral areas from where a central neurone can be activated and the recruitment of previous non-responding nociceptive neurones. Furthermore, it has been possible to modulate this neuronal hyperexcitability by the discovery of molecular targets for pain, by sequencing DNA of ion channels and receptors and by development of new molecules that exert their effects on these molecular targets. The changes in responsiveness appear to be partly time and intensity dependent and partly dependent on the cause of injury. Whereas relatively short-lasting and moderate noxious input leads to reversible plastic changes, more intense and long-lasting noxious stimulation implies a risk for persistent and more profound alterations in transmitters, receptors, ion channels and in neuronal connectivity.

Pain is a major clinical problem in society and the number of patients suffering from chronic pain is enormous. Within the last two decades the field of pain research has resulted in a considerable advance in our understanding of pain mechanisms: molecular targets such as the vanilloid and opioid receptors together with different ion channels have been unravelled (1-4), spinal neuroplastic changes following a sustained noxious input have been described (5) and it has been possible by new imaging techniques to obtain insight into a former black box, the brain, and observe the processing of acute and chronic pain (6). The area is exploding with new information. Whereas it is obvious that acute and postoperative types of pain are better handled now than previously, progress in the treatment of chronic pain conditions, in particular neuropathic pain conditions, has been at a much slower pace. In fact, it is not at all clear that the advance in knowledge has resulted in a better clinical treatment of those suffering from chronic pain. The rapid basic expansion in terms of DNA sequencing of receptors and ion channels and the development in understanding neuroplastic changes in the nervous system following noxious input stands in sharp contrast to our handling of chronic pain patients. It is not at all clear that the number of chronic pain patients has been reduced, and in many instances treatment of chronic pain is still disappointing (7). In the following we will briefly describe areas where basic research has contributed or is expected to contribute to a better handling of chronic types of pain.

Despite the explosion of new knowledge in pain processing and in the molecular background for neuroplasticity, this progress has unfortunately not resulted in a corresponding improvement of our ability to treat chronic pain. The number of patients with chronic unrelieved pain is still high and newer types of treatment have so far not resulted in a substantially better treatment. Nevertheless, there is now ongoing systematic research in which chronic pain conditions are assessed in a fashion such that mechanisms underlying pain can be dissected. Moreover, controlled clinical trials together with systematic reviews are being carried out which in the future should permit the formulation of treatment algorithms for chronic pain. Finally, it is likely that the development of new specific types of treatment will show efficacy if they are evaluated and analysed not on the global pain experience, but more specifically on those targets and elements of the pain experience they are aimed to deal with.

FLUKA is a general purpose Monte Carlo code capable of handling all radiation components from thermal energies (for neutrons) or 1 keV (for all other particles) to cosmic ray energies and can be applied in many different fields. Presently the code is maintained on Linux. The validity of the physical models implemented in FLUKA has been benchmarked against a variety of experimental data over a wide energy range, from accelerator data to cosmic ray showers in the Earth atmosphere. FLUKA is widely used for studies related both to basic research and to applications in particle accelerators, radiation protection and dosimetry, including the specific issue of radiation damage in space missions, radiobiology (including radiotherapy) and cosmic ray calculations.

There is growing interest in language barriers in health care (LBHC), that is, in how the quality of health care service delivery might be compromised when patients and health care providers do not share the same first language. This article discusses LBHC as an emerging research area that provides valuable opportunities for researchers in various branches of the language sciences (including, among others, applied linguistics, theoretical linguistics, psycholinguistics, and second language acquisition) to conduct basic research and to make contributions to the socially important area of medical communication. This article also proposes a research agenda aimed at attracting general language researchers to the study of LBHC, an agenda that is theory driven, programmatic, problem-solving oriented, and interdisciplinary in scope. In proposing this agenda, selected examples are reviewed from the current literature that can serve as illustrative models for how future research into LBHC can proceed.

Apart from an early case report from China (13th century) and later artistic contributions, the first observations on insects and other arthropods as forensic indicators were documented in Germany and France during mass exhumations in the late 1880s by Reinhard and Hofmann, whom we propose recognizing as co-founders of the discipline. After the French publication of Mégnin's popular book on the applied aspects of forensic entomology, the concept quickly spread to Canada and the US. At the time, researchers recognized that the lack of systematic observations of forensically important insects stood in the way of their use as indicators of postmortem interval. General advances in insect taxonomy and ecology helped close this gap over the following decades.

The physiology of erection has received intense clinical and basic research scrutiny over the past two decades. This massive effort has led to a much clearer understanding of the macroscopic aspects of erection as well as identification of the prominent features of the etiology of erectile dysfunction (ED). However, it is clear that the devil is in the details of the erectile process. Therefore, to truly understand the precise mechanistic basis for erection and ED, much more still must be learned about how the biochemical cascades in the corporal smooth muscle cells are integrated to produce a normal erection, or how they are altered to result in ED. The ultimate goal of these basic research and clinical efforts will be to provide a rational scientific basis for mechanism-based, patient-specific therapies for ED. This article reviews fundamental aspects of the physiology of erection and summarizes the most recent information available concerning the putative biochemical correlates of...

Basic research in pattern formation is concerned with the generation of phenotypes and tissues. It can therefore lead to new tools for medical research. These include phenotypic screening assays, applications in tissue engineering, as well as general advances in biomedical knowledge. Our aim here is to discuss this emerging field with special reference to tools based on zebrafish developmental biology. We describe phenotypic screening assays being developed in our own and other labs. Our assays involve: (i) systemic or local administration of a test compound or drug to zebrafish in vivo; (ii) the subsequent detection or "readout" of a defined phenotypic change. A positive readout may result from binding of the test compound to a molecular target involved in a developmental pathway. We present preliminary data on assays for compounds that modulate skeletal patterning, bone turnover, immune responses, inflammation and early-life stress. The assays use live zebrafish embryos and larvae as well as adult fish undergoing caudal fin regeneration. We describe proof-of-concept studies on the localised targeting of compounds into regeneration blastemas using microcarriers. Zebrafish are cheaper to maintain than rodents, produce large numbers of transparent eggs, and some zebrafish assays could be scaled-up into medium and high throughput screens. However, advances in automation and imaging are required. Zebrafish cannot replace mammalian models in the drug development pipeline. Nevertheless, they can provide a cost-effective bridge between cell-based assays and mammalian whole-organism models.

This paper analyses trends in basic research on the psychology of learning and its impact, based on the papers published in the main psychology journals in Spanish: Revista Latinoamericana de Psicologia (RLP), Psicothema (PT), Revista Mexicana de Psicologia (RMP), and Revista Interamericana de Psicologia (RIP). The first three are indexed by the Institute for Scientific Information (ISI), and the fourth was until a few years ago. The percentages of articles published on basic research in learning are as follows: RLP (40%), PT (38.3%), RIP (15%), and RMP (6.7%). The countries with the largest number of published studies are Spain, Mexico, Argentina and Colombia. The most productive decade was the most recent, 1990-1999. The most frequently studied species have been rats, humans and pigeons. The types of learning studied were stimulus discrimination, aversive instrumental conditioning, classical conditioning, and instrumental conditioning...

The Impact Dynamics Research Facility (IDRF) is a 240-ft-high gantry structure located at NASA Langley Research Center.

Molecular epidemiology uses biomarkers and advanced technology to refine the investigation of the relationship between environmental exposures and diseases in humans. It requires careful handling and storage of precious biological samples with the goals of obtaining a large amount of information from limited samples, and minimizing future research costs by use of banked samples. Many factors, such as tissue type, time of collection, containers used, preservatives and other additives, transport means and length of transit time, affect the quality of the samples and the stability of biomarkers and must be considered at the initial collection stage. An efficient study design includes provisions for further processing of the original samples, such as cryopreservation of isolated cells, purification of DNA and RNA, and preparation of specimens for cytogenetic, immunological and biochemical analyses. Given the multiple uses of the samples in molecular epidemiology studies, appropriate informed consent must be obtained from the study subjects prior to sample collection. Use of barcoding and electronic databases allows more efficient management of large sample banks. Development of standard operating procedures and quality control plans is a safeguard of the samples' quality and of the validity of the analytical results. Finally, specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples, as well as communication of study results.

The use of ice slurries dates back many millennia; for example, the ancient Romans applied the cooling of snow-water and ice-water mixtures. Approximately two decades ago, a breakthrough in the new technology of producing ice slurries occurred in the refrigeration domain for the cooling of shops and supermarkets. After some difficulties at the beginning, it is now possible to build systems which operate as designed. However, there is still a huge potential to lower the energy consumption and the costs of the systems. Actions in this direction are the design of new ice slurry generators, the development of new concepts for storage and mixing, etc. In this article a short review of the basic research on ice slurries is presented. Furthermore, practical problems of the application of the technology in refrigeration and process techniques are discussed.

Nowadays, speech recognition technology is improving fast and many techniques have been presented. However, even the best algorithm in a carefully designed system cannot achieve good speech recognition performance if the system is trained on a poor corpus. Therefore, the speech corpus is the necessary basic research resource; it should be consistent and balanced, as well as covering all...

This paper presents a unitary approach to emotion and emotion regulation, building on the excellent points in the lead article by Cole, Martin, and Dennis (this issue), as well as the fine commentaries that follow it. It begins by stressing how, in the real world, the processes underlying emotion and emotion regulation appear to be largely one and the same, rendering the distinction valuable chiefly for purposes of analysis. There is an extensive discussion of how the same processes can generate emotions (i.e., are constitutive of emotion) and account for variability in the manifestation of emotion in context (i.e., regulate them). Following an extensive review of many of the principles involved in emotion and emotion regulation, the paper presents implications for the developmental study of infants and children, includes several methodological recommendations, and concludes with an analysis of the extent to which contemporary affective neuroscience contributes to the study of emotion and emotion regulation.

This study aims to trace the conceptual evolutionary path of theories on corporate social responsibility (CSR) and to reflect on the implications of the development. The retrospection has revealed that the trend has been a progressive rationalization of the concept with a particular focus on tighter coupling with organizations' financial goals. Rationalization involves two broad shifts in the conceptualization of CSR. First, in terms of the level of analysis, researchers have moved from the discussion of the macro-social effects of CSR to organizational-level analysis of CSR's effect on profit. Next, in terms of theoretical orientation, researchers have moved from explicitly normative and ethics-oriented arguments to implicitly normative and performance-oriented managerial studies. Based on the retrospection, the limitations of the current state of CSR research that places excessive emphasis on the business case for CSR are outlined, and it is suggested that future research needs to refocus on basic research in order to develop conceptual tools and theoretical mechanisms that explain changing organizational behavior from a broader societal perspective.

Empirical studies of information retrieval methods show that good retrieval performance is closely related to the use of various retrieval heuristics, such as TF-IDF weighting. One basic research question is thus what exactly are these "necessary" heuristics that seem to cause good retrieval performance. In this paper, we present a formal study of retrieval heuristics. We formally define a set of basic desirable constraints that any reasonable retrieval function should satisfy, and check these constraints on a variety of representative retrieval functions. We find that none of these retrieval functions satisfies all the constraints unconditionally. Empirical results show that when a constraint is not satisfied, it often indicates non-optimality of the method, and when a constraint is satisfied only for a certain range of parameter values, its performance tends to be poor when the parameter is out of the range. In general, we find that the empirical performance of a retrieval formula is tightly related to how well it satisfies these constraints. Thus the proposed constraints provide a good explanation of many empirical observations and make it possible to evaluate any existing or new retrieval formula analytically.
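
To make the flavor of this analysis concrete, the following Python sketch implements one classic TF-IDF-style weighting and probes it numerically against an intuitive "diminishing returns" property of term frequency. Both the scorer and the property as phrased here are illustrative stand-ins, not the paper's formal constraint definitions.

```python
import math

def tf_idf_score(tf, df, n_docs):
    # Sublinear TF (1 + log tf) with a classic IDF factor; this is the
    # contribution of one query term to one document's score.
    return (1 + math.log(tf)) * math.log((n_docs + 1) / df) if tf > 0 else 0.0

N, df = 100_000, 50
gains = [tf_idf_score(tf, df, N) - tf_idf_score(tf - 1, df, N)
         for tf in range(1, 6)]
print([round(g, 2) for g in gains])   # first occurrence helps most
# Each additional occurrence of the term adds less than the one before it,
# so this scorer satisfies the diminishing-returns intuition. A scorer
# linear in raw TF would violate it, which is the kind of non-optimality
# the constraint analysis is designed to expose.
```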