Latent variable modeling Research Papers

This study aimed to identify subgroups with distinct trajectories of functional (HAQ) progression over 10 years following diagnosis of RA, and to identify baseline characteristics associated with the trajectories and their prognostic value for mortality. Between 1986 and 1998, 1460 patients with RA symptoms of <2 years' duration, recruited prior to disease-modifying treatment (DMARDs), entered an inception cohort (Early RA Study). Standard clinical, functional and laboratory assessments were performed at presentation and annually. Deaths were tracked by the National Health Service Central Register. Growth mixture modelling was used to identify distinct trajectories of HAQ score progression, and survival analysis was employed to compare all-cause mortality across the trajectory classes. Four HAQ score progression classes were identified: moderate increasing (46%), low stable (6%), moderate stable (28%) and high stable (20%). Only the moderate-increasing class exhibited an accelerated decline in funct...
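
As a rough illustration of the kind of analysis described above (not the Early RA Study model itself), the sketch below approximates growth mixture modelling in two stages: per-patient growth coefficients are extracted first and then clustered with a Gaussian mixture, with the class count chosen by BIC. The simulated HAQ values, the linear growth form, and all variable names are assumptions for illustration only; a full growth mixture model would estimate class-specific growth curves and class memberships jointly (e.g., in Mplus).

```python
# Simplified two-stage stand-in for growth mixture modelling; toy data only.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
years = np.arange(11)                      # annual assessments, years 0-10
n_patients = 200
# toy trajectories: random intercepts/slopes plus noise (assumption, not ERAS data)
true_int = rng.normal(1.0, 0.5, n_patients)
true_slope = rng.normal(0.05, 0.05, n_patients)
haq = true_int[:, None] + true_slope[:, None] * years + rng.normal(0, 0.2, (n_patients, 11))

# stage 1: per-patient linear growth coefficients (intercept, slope)
coefs = np.array([np.polyfit(years, y, deg=1)[::-1] for y in haq])

# stage 2: finite mixture over the growth coefficients, class count chosen by BIC
best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(coefs) for k in range(1, 6)),
    key=lambda m: m.bic(coefs),
)
classes = best.predict(coefs)
print(best.n_components, np.bincount(classes))
```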

Testing for measurement invariance can be done within the context of multigroup latent class analysis. Latent class analysis can model any type of discrete-level data, which makes it an obvious choice when nominal indicators are used or when a researcher's aim is to classify respondents into latent classes. The multigroup latent class (LC) model can be specified in three different ways, i.e., by adopting a probabilistic, a log-linear or a logistic parameterization. We define and compare these different forms of parameterization. The starting point is the standard LC model in which indicators and latent variables are defined at the nominal level. Additionally, we focus on LC models with ordinal indicators as well as LC factor models with ordinal indicators. Testing for measurement invariance involves estimating LC models with different degrees of homogeneity. We explain the procedure for investigating measurement invariance at both the scale and the item level. We illustrate the approach with two examples. The first example is a multigroup LC analysis with nominal indicators; the second is a multigroup LC factor analysis with ordinal indicators.
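
For reference, a schematic of the probabilistic form of a multigroup LC model and one way to write the logistic (logit) parameterization of its class-conditional response probabilities. The notation below is illustrative and does not reproduce the paper's own symbols; measurement invariance then corresponds to restricting the group-specific terms in the measurement part.

```latex
% Probabilistic parameterization: G groups, T latent classes, J nominal items.
P(\mathbf{Y} = \mathbf{y} \mid G = g)
  = \sum_{t=1}^{T} P(X = t \mid G = g)\,
    \prod_{j=1}^{J} P(Y_j = y_j \mid X = t, G = g)

% Logistic (logit) parameterization of a class-conditional response probability
% (a fully heterogeneous model would add a class-by-group interaction term;
%  the equivalent log-linear form writes the same model for the joint
%  item-by-class-by-group frequency table).
P(Y_j = y_j \mid X = t, G = g)
  = \frac{\exp(\beta_{j y_j} + \gamma_{j y_j t} + \delta_{j y_j g})}
         {\sum_{r} \exp(\beta_{j r} + \gamma_{j r t} + \delta_{j r g})}
```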

This article presents a combined motivational and volitional intervention, based on the theory of planned behavior, aimed at promoting expansion-oriented job crafting behaviors. Participants were employees working in different companies, assigned to either an intervention (n = 53) or a control group (n = 55). Results of a field study (including a premeasure, a postmeasure, and weekly diaries) indicated that the intervention enhanced participants' perceptions of behavioral control with respect to job crafting and their awareness of others' engagement in job crafting. Latent change growth modeling showed that participation in the intervention led participants to shape their job crafting intentions over the weeks, which translated into more frequent job crafting behaviors at the end of the study period. In addition, the intervention triggered weekly work-related flow experiences in terms of high absorption while working. Findings suggest that job crafting interventions can benefit from the inclusion of self-regulatory strategies complementing goal setting.

Latent class analysis (LCA) and latent profile analysis (LPA) are powerful techniques that enable researchers to glean insights into “hidden” psychological experiences to create typologies and profiles to provide better-informed community-based policies and practice. These analytic methods have been used in a variety of domains, such as psychosis symptomatology in the general population (Kibowski & Williams, 2012; Murphy, Shevlin, & Adamson, 2007; Shevlin, Murphy, Dorahy, & Adamson, 2007); substance abuse (Cleveland, Collins, Lanza, Greenberg, & Feinberg, 2010; James, McField, & Montgomery, 2013); peer victimization (Nylund, Bellmore, Nishina, & Graham, 2007); and anti-social/self-defeating behavior (Rosato & Baer, 2010). LCA and LPA are versatile methods of dealing with data of interest to community-based researchers in a deep and psychologically grounded way. This chapter will address the nuances of how and when to use LCA and LPA. Case studies of LCA and LPA will also be presented to illustrate the applicability of these techniques.

In this invited article, Cabrera-Nguyen provides guidelines for reporting scale development and validation results. Authors' attention to these guidelines will help ensure the research reported in JSSWR is rigorous and of high quality. This article provides guidance for those using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). In addition, the article provides helpful links to resources addressing structural equation modeling, multiple imputation for missing data, and a general resource for quantitative data analysis.

A dissertation combining both quantitative and qualitative analyses to describe the Old Assyrian social networks, the remains of a Middle Bronze Age (IIa) merchant colony with surviving documents from 1970 to 1720 B.C.E., including a hierarchical social reconstruction, chronological familial genealogies, and social network analysis of ca. 6,000 Old Assyrian texts from Kültepe, Turkey. The initial Western discovery and subsequent excavations in the early 20th century and onwards yielded thousands of cuneiform tablets. By exploiting the names of the individual merchants and their relationships to each other, as recorded on the tablets, we demonstrate how to generate quantitative and qualitative social networks and populate them with hierarchical rank distributions reflecting the merchants' age and seniority in the network, the results of a latent variable model described herein. A method for homonym disambiguation is delineated and applied for the purpose of assessing the age of cohort groups for demographic analysis. We conclude that grounding the primary sources in a structural networked setting will allow for a closer examination of the textual narratives around events occurring in the trade network, scalable from multiple vantage points, from a single merchant's household to the broadest extent of the trade networks, stretching 1,000 km between the city-state of Assur and this central trade hub in Anatolia, known as Kanesh.

A common situation in the evaluation of intervention programs is that the researcher can rely on only two waves of data (i.e., pretest and posttest), which strongly constrains the statistical analyses that can be conducted. Indeed, the evaluation of intervention programs based on a pretest-posttest design has usually been carried out using classic statistical tests, such as ANOVA-family analyses, which are strongly limited by analyzing intervention effects exclusively at the group level. In this article, we show how second-order multiple-group latent change modeling (SO-MG-LCM) can represent a useful methodological tool for a more realistic and informative assessment of intervention programs with two waves of data. We offer a practical step-by-step guide to properly implement this methodology and outline the advantages of the LCM approach over classic ANOVA analyses. Furthermore, we provide a real-data example by re-analyzing the implementation of the Young Prosocial Animation, a universal intervention program aimed at promoting prosociality among youth. Although previous studies have pointed to the usefulness of MG-LCM for evaluating intervention programs (Curran & Muthén, 1999; Muthén & Curran, 1997), no previous study has shown that this approach can be used even in pretest-posttest (i.e., two time point) designs. Given the advantages of latent variable analyses in examining differences in interindividual and intraindividual changes (McArdle, 2009), the methodological and substantive implications of our proposed approach are discussed.
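
As a minimal sketch (generic notation, not necessarily the article's), the two-wave latent change idea underlying SO-MG-LCM can be written as follows; the second-order, multiple-group version adds a measurement model for the latent construct at each wave and compares the latent change parameters across intervention and control groups.

```latex
% Two-wave latent change specification (illustrative notation): the posttest
% latent score equals the pretest score plus a latent change factor whose
% mean and variance are estimated.
\eta_{post,i} = \eta_{pre,i} + \Delta\eta_i, \qquad
\Delta\eta_i = \mu_{\Delta} + \zeta_i, \quad \zeta_i \sim N(0, \psi_{\Delta})
% In the multigroup version, the contrast of interest is
% \mu_{\Delta}^{(intervention)} versus \mu_{\Delta}^{(control)}.
```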

The study of multivariate data in situations where a variable of interest is unobservable (latent) and only measured indirectly is widely applied. Item response models are powerful tools for measurement and have been extended to incorporate latent structure. The (loglinear) Rasch model is a simple item response model in which tests of fit and item parameter estimation can take place without assumptions about the distribution of the latent variable. Inclusion of a latent variable as a predictor in standard regression models, such as logistic or Poisson regression models, is discussed, and a study of the relation between psychosocial work environment and absence from work is used to illustrate and motivate the results.
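
For concreteness, the dichotomous Rasch model referred to above has the standard form below (generic notation, not the chapter's own); in its loglinear/conditional formulation, item parameters can be estimated by conditioning on the raw score, which is what allows inference without assumptions about the latent distribution.

```latex
% Dichotomous Rasch model for item i and person latent value \theta.
P(X_i = 1 \mid \theta) = \frac{\exp(\theta - \beta_i)}{1 + \exp(\theta - \beta_i)}
```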

In this paper, we propose a Gaussian process (GP) model for analysis of nonlinear time series. Formulation of our model is based on the consideration that the observed data are functions of latent variables, with the associated mapping between observations and latent representations modeled through GP priors. In addition, to capture the temporal dynamics in the modeled data, we assume that subsequent latent representations depend on each other on the basis of a hidden Markov prior imposed over them. Derivation of our model is performed by marginalizing out the model parameters in closed form using GP priors for observation mappings, and appropriate stick-breaking priors for the latent variable (Markovian) dynamics. This way, we eventually obtain a nonparametric Bayesian model for dynamical systems that accounts for uncertainty in the modeled data. We provide efficient inference algorithms for our model on the basis of a truncated variational Bayesian approximation. We demonstrate the efficacy of our approach considering a number of applications dealing with real-world data, and compare it with the related state-of-the-art approaches.
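
Schematically, the generative structure described in this abstract can be summarized as follows (generic notation; the stick-breaking construction over the Markovian dynamics and the variational treatment are omitted):

```latex
% Latent states follow a (hidden) Markov chain; observations are GP-mapped
% functions of the latent states plus noise.
x_t \mid x_{t-1} \sim p(x_t \mid x_{t-1}) \quad \text{(Markovian latent dynamics)}
y_t = f(x_t) + \varepsilon_t, \qquad
f \sim \mathcal{GP}\big(0, k(\cdot,\cdot)\big), \quad
\varepsilon_t \sim N(0, \sigma^2 I)
```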

Background: We aimed to identify different categorical phenotypes based upon the DSM-5 criteria for alcohol use disorders (AUD) among alcohol users who had at least one drink per week in the past year (n = 948).
Methods: Data are from the São Paulo Megacity Mental Health Survey collected in 2005–2007, as part of the World Mental Health Survey Initiative. A latent class analysis of the 11 DSM-5-AUD criteria was performed using Mplus, taking into account complex survey design features. Weighted logistic regression models were used to examine demographic correlates of the DSM-5-AUD latent classes.
Results: The best latent-class model was a three-class model. We found a “non-symptomatic class” (69.7%), a “use in larger amounts class” (23.2%), defined by a high probability (>70%) of the “use in larger amounts” criterion only, and a “high-moderate symptomatic class” (7.1%), defined by high-to-moderate probabilities of all 11 AUD criteria. Compared to those in the non-symptomatic class, individuals in the “high-moderate symptomatic class” were more likely to have been married, to have lower educational attainment and to be unemployed or in non-regular/informal employment. Those in the “use in larger amounts class” were more likely to have been married or never married.
Conclusion: The two symptomatic classes clearly represented the dimensionality of the new proposed AUD criteria, and could be more specifically targeted by different prevention or treatment strategies. DSM-5-AUD has the advantage of shedding light on risky drinkers included in the “use in larger amounts class”, allowing for preventive interventions, which will reach a large number of individuals.
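
The study fitted its latent class models in Mplus with complex-survey weights. Purely as an illustration of the mechanics, the sketch below implements a minimal unweighted EM algorithm for a latent class model with binary indicators on simulated 0/1 "criteria" data; all names and the toy data are assumptions, not the São Paulo survey.

```python
# Minimal unweighted EM for a latent class model with binary indicators.
import numpy as np

def lca_em(Y, n_classes, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    n, j = Y.shape
    pi = np.full(n_classes, 1.0 / n_classes)          # class sizes
    rho = rng.uniform(0.25, 0.75, (n_classes, j))     # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior class-membership probabilities
        log_lik = (Y[:, None, :] * np.log(rho) +
                   (1 - Y[:, None, :]) * np.log(1 - rho)).sum(axis=2)
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update class sizes and conditional response probabilities
        pi = post.mean(axis=0)
        rho = np.clip(post.T @ Y / post.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)
    return pi, rho, post

# toy example: 11 binary "criteria", 3 classes (purely illustrative)
rng = np.random.default_rng(1)
Y = (rng.uniform(size=(500, 11)) < 0.3).astype(float)
pi, rho, post = lca_em(Y, n_classes=3)
print(np.round(pi, 3))
```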

Simultaneous latent-class analysis across groups (SLCAG) is an extension of the standard latent class (LC) model for the examination of measurement equivalence/invariance. It can be used to compare the latent structure derived from a set of discrete item responses between multiple groups, e.g., between males and females, between Japanese, Americans and Dutch, between young and old, and between ill and healthy. Whereas the more commonly used multigroup confirmatory factor analysis (MCFA) assumes that the underlying latent variables are continuous, SLCAG treats the latent variables either as nominal – e.g., to identify a typological classification from a given set of categorical indicators – or as ordinal – e.g., to investigate the scalability of a set of categorical indicators. These two specifications are sometimes referred to as LC cluster and LC factor models, respectively.

The major challenge of learning from multi-label data arises from the overwhelming size of the label space, which makes the problem NP-hard. This problem can be alleviated by gradually involving easy-to-hard tags in the learning process. In addition, the use of a diversity-maintenance approach avoids overfitting on a subset of easy labels. In this paper, we propose self-paced multi-label learning with diversity (SPMLD), which aims to cover diverse labels with respect to its learning pace. The proposed framework is applied to an efficient correlation-based multi-label method. The non-convex objective function is optimized by an extension of the block coordinate descent algorithm. Empirical evaluations on real-world datasets with different dimensions of features and labels demonstrate the effectiveness of the proposed predictive model.
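
For orientation, the generic self-paced learning objective that schemes like this build on can be written as below. This is the standard formulation with a hard-threshold pace regularizer, not the exact SPMLD objective, which additionally encodes label diversity and correlations.

```latex
% v_i in [0,1] are per-instance (or per-label) "easiness" weights; the pace
% parameter \lambda is gradually increased so harder items enter later.
\min_{\mathbf{w},\, \mathbf{v} \in [0,1]^n}
  \sum_{i=1}^{n} v_i \, \ell\big(y_i, f(\mathbf{x}_i; \mathbf{w})\big)
  - \lambda \sum_{i=1}^{n} v_i
% Alternating (block coordinate) updates: with w fixed, the closed form is
% v_i = 1 if \ell_i < \lambda and v_i = 0 otherwise; w is then re-fit on the
% currently selected items and \lambda is increased.
```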

Research on emotional labor focuses on how employees utilize 2 main regulation strategies—surface acting (i.e., faking one’s felt emotions) and deep acting (i.e., attempting to feel required emotions)—to adhere to emotional expectations of their jobs. To date, researchers largely have considered how each strategy functions to predict outcomes in isolation. However, this variable-centered perspective ignores the possibility that there are subpopulations of employees who may differ in their combined use of surface and deep acting. To address this issue, we conducted 2 studies that examined surface acting and deep acting from a person-centered perspective. Using latent profile analysis, we identified 5 emotional labor profiles—non-actors, low actors, surface actors, deep actors, and regulators—and found that these actor profiles were distinguished by several emotional labor antecedents (positive affectivity, negative affectivity, display rules, customer orientation, and emotion demands–abilities fit) and differentially predicted employee outcomes (emotional exhaustion, job satisfaction, and felt inauthenticity). Our results reveal new insights into the nature of emotion regulation in emotional labor contexts and how different employees may characteristically use distinct combinations of emotion regulation strategies to manage their emotional expressions at work.

Being part of a delinquent group has been shown to facilitate the expression of an individual's own delinquent propensities. However, this facilitation effect has not been investigated from a developmental perspective within a population heterogeneity model. Using a semiparametric mixture model with data from the Montreal Longitudinal Experimental Study, this article addresses important issues in the developmental trends of membership in delinquent groups. We explore how the rate of violent behaviors follows delinquent peer group trajectories and investigate a differential facilitation effect of delinquent peers on violence across multiple developmental pathways. Results suggest that 25% of males followed either a childhood or an adolescence delinquent group affiliation trajectory. These two groups account for most of the violent acts assessed during adolescence. In addition, the rate of violent behaviors follows these developmental trajectories. Controlling for these delinquent group trajectories, we also found that being involved in a delinquent group at any specific time during adolescence is associated with an increased rate of violent behaviors, and that leaving these groups results in a decrease in violent behaviors. This facilitation effect appears homogeneous over time and across developmental trajectories. Results are discussed from a social interactional perspective.

Correlation filters are special classifiers designed for shift-invariant object recognition, which are robust to pattern distortions. The recent literature shows that combining a set of sub-filters, each trained on a single image or a small group of images, obtains the best performance. The idea is equivalent to estimating the variable distribution based on data sampling (bagging), which can be interpreted as finding solutions (variable distribution approximations) directly from the sampled data space. However, this methodology fails to account for the variations that exist in the data. In this paper, we introduce an intermediate step – solution sampling – after the data sampling step to form a subspace in which an optimal solution can be estimated. More specifically, we propose a new method, named latent constrained correlation filters (LCCF), by mapping the correlation filters to a given latent subspace, and develop a new learning framework in the latent subspace that embeds distribution-related constraints into the original problem. To solve the optimization problem, we introduce a subspace-based alternating direction method of multipliers (SADMM), which is proven to converge at the saddle point. Our approach is successfully applied to three different tasks, including eye localization, car detection and object tracking. Extensive experiments demonstrate that LCCF outperforms the state-of-the-art methods.
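
To make "correlation filter" concrete, the sketch below implements the basic single-channel, Fourier-domain ridge/MOSSE-style filter that sub-filter ensembles of this kind start from; it is not LCCF or SADMM, and the data, window size, and regularization value are illustrative assumptions. Working in the Fourier domain turns correlation into element-wise products, which is what makes closed-form training of each sub-filter cheap.

```python
# Minimal single-channel correlation filter trained in the Fourier domain.
import numpy as np

def train_filter(patches, target, lam=1e-2):
    """patches: (n, H, W) training windows; target: (H, W) desired response."""
    G = np.fft.fft2(target)
    A = np.zeros_like(G)
    B = np.zeros_like(G)
    for p in patches:
        F = np.fft.fft2(p)
        A += G * np.conj(F)          # numerator: correlation with the target
        B += F * np.conj(F)          # denominator: input energy spectrum
    return A / (B + lam)             # regularized closed-form filter

def respond(filt, patch):
    """Correlate a new patch with the trained filter; the peak marks the object."""
    return np.real(np.fft.ifft2(filt * np.fft.fft2(patch)))

# toy usage: Gaussian-shaped desired response centered on the object
H = W = 64
yy, xx = np.mgrid[0:H, 0:W]
target = np.exp(-((yy - H // 2) ** 2 + (xx - W // 2) ** 2) / (2 * 3.0 ** 2))
patches = np.random.default_rng(0).normal(size=(5, H, W))
filt = train_filter(patches, target)
print(respond(filt, patches[0]).shape)   # (64, 64) response map
```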

A Monte Carlo approach was used to examine bias in the estimation of indirect effects and their associated standard errors. In the simulation design, (a) sample size, (b) the level of nonnormality characterizing the data, (c) the population values of the model parameters, and (d) the type of estimator were systematically varied. Estimates of model parameters were generally unaffected by either nonnormality or small sample size. Under severely nonnormal conditions, normal theory maximum likelihood estimates of the standard error of the mediated effect exhibited less bias (approximately 10% to 20% too small) compared to the standard errors of the structural regression coefficients (20% to 45% too small). Asymptotically distribution free standard errors of both the mediated effect and the structural parameters were substantially affected by sample size, but not nonnormality. Robust standard errors consistently yielded the most accurate estimates of sampling variability.
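
As a small worked example of the quantities being compared (not a reproduction of the simulation design), the sketch below computes the indirect effect a·b from two OLS regressions on toy data, together with its delta-method (Sobel) standard error and a bootstrap standard error; the sample size, population values, and all names are illustrative assumptions.

```python
# Indirect (mediated) effect a*b with Sobel and bootstrap standard errors.
import numpy as np

def ols(X, y):
    """OLS coefficients and their standard errors for design matrix X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)                 # toy a-path = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)       # toy b-path = 0.4

ones = np.ones(n)
a_beta, a_se = ols(np.column_stack([ones, x]), m)        # m ~ x
b_beta, b_se = ols(np.column_stack([ones, x, m]), y)     # y ~ x + m
a, se_a = a_beta[1], a_se[1]
b, se_b = b_beta[2], b_se[2]

indirect = a * b
sobel_se = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)      # delta-method SE

# bootstrap SE of the indirect effect
boot = []
for _ in range(2000):
    i = rng.integers(0, n, n)
    ab1, _ = ols(np.column_stack([ones, x[i]]), m[i])
    ab2, _ = ols(np.column_stack([ones, x[i], m[i]]), y[i])
    boot.append(ab1[1] * ab2[2])
print(round(indirect, 3), round(sobel_se, 3), round(float(np.std(boot)), 3))
```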

Recent excavations have revealed the topography of the agglomerated settlements of the Toulouse region in the 2nd and 1st centuries BCE. For the first time, the urban layout of the oppidum of Vieille-Toulouse has been brought to light: a street grid that appeared between 150 and 125 BCE and was maintained, with adjustments, until the site was abandoned around 10 BCE. In the plain at Saint-Roch, the Gaulish occupation covers an area of roughly 80 ha, without any large-scale internal organization. It was a secondary agglomeration probably dedicated to specific activities linked to the Garonne.

The cross-classified multiple membership latent variable regression (CCMM-LVR) model is a recent extension of the three-level latent variable regression (HM3-LVR) model that can be utilized for longitudinal data containing individuals who changed clusters over time (for instance, student mobility across schools). The HM3-LVR model can allow the effect of initial status on growth to vary across those clusters and allows testing of more flexible hypotheses about the influence of initial status on growth and about factors that might impact that relationship, but only in the presence of pure clustering of participants within higher-level units. This Monte Carlo study was conducted to evaluate model estimation under a variety of conditions and to measure the impact of ignoring cross-classified data when estimating the incorrectly specified HM3-LVR model in a scenario in which the true parameter values are known. Furthermore, results from a real-data analysis were used to inform the design of the simulation. Overall, researchers are recommended to utilize the CCMM-LVR model over the HM3-LVR model when individuals are cross-classified, and to use more than 100 clustering units at a bare minimum in order to avoid overestimation of the level-3 variance component estimates.

On the one hand, the factors Gf and Gc in the Cattell–Horn–Carroll (CHC) model of intelligence are hypothesized to represent individual differences in unique psychological or biological capacities. On the other hand, they are interpreted as representing the theoretical variables fluid and crystallized intelligence in investment theory. With respect to Gc, this leads to a theoretical conflict, because in investment theory crystallized intelligence is not a capacity but purely a statistical entity. We contend that if the CHC factor Gc represents a capacity, it cannot represent crystallized intelligence, and if Gc represents crystallized intelligence, it does not represent a capacity. In addition, from our discussion of Gc, we conclude that in investment theory the factors Gf and g represent one and the same capacity. In support of our theoretical conclusions, we implemented the CHC model in a confirmatory factor analysis of an HCA (Human Cognitive Abilities project) data set. As expected, Gf and g were statistically indistinguishable. Gc was effectively absent, because it was statistically equivalent to verbal comprehension. Factors Gc and g could be removed from the model without any reduction in model fit. We argue that in the CHC taxonomy the factors Gc and g are redundant as explanatory variables.

Highlights: ► We discuss the theoretical status of the (CHC) factors Gc, Gf and g. ► If factor Gc represents crystallized intelligence it does not represent a capacity. ► If factor Gc represents a capacity it cannot represent crystallized intelligence. ► Gf and g represent the same (reasoning) capacity. ► Factors Gc and g can be removed from the CHC model as explanatory variables.

This article is a methodological illustration of the LMS (Latent Moderated Structural Equations) method available in the Mplus software. Data collected to study the motivation of adolescent girls (n = 434) in physical education are used to present the methodology to follow when evaluating the interaction of latent variables in structural equation models. The text focuses on the reader's general understanding of how to apply this method, with an emphasis on the presentation and interpretation of the results. Finally, the advantages of the LMS method are highlighted and avenues for further methodological illustrations are proposed.

Within the Generalized Linear Latent Variable Models context (GLVM; Moustaki and Knott, 2000), we discuss the implementation of Bayesian measures of model complexity such as the Bayes Factor (BF; Kass and Raftery, 1995).
Patz and Junker (1999) initially proposed a Bayesian approach to estimating the parameters of a latent variable model with categorical responses. Prior distributions are assigned to the model parameters as well as to the latent vector. A Markov chain whose stationary distribution is the required posterior distribution P(α,β,z|x) is simulated via a Metropolis-Hastings within Gibbs algorithm (Chib and Greenberg, 1995). After a sufficiently long run of the chain, inference can be made about each parameter. We expand this work by addressing the problem of approximating the marginal likelihood, over all parameters and for each competing model, involved in the calculation of the BF. Five methods proposed in the Bayesian literature are applied in the GLVM, namely: the harmonic mean estimator (Raftery et al., 2007), the importance sampling estimator (Newton and Raftery, 1994), the Laplace estimator (Lewis and Raftery, 1997), the Chib and Jeliazkov estimator (Chib and Jeliazkov, 2001), and the power posterior estimator (Friel and Pettitt, 2008). A comparison of the estimators with respect to accuracy and computational complexity is presented.
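
For reference, the Bayes factor and, as one example of the five approaches listed, the harmonic mean estimator of a marginal likelihood from posterior draws can be written as follows (standard textbook forms, not the paper's notation):

```latex
% Bayes factor between competing latent variable models M_1 and M_2,
% and the harmonic mean estimator from S posterior draws \theta^{(1..S)}.
BF_{12} = \frac{p(\mathbf{x} \mid M_1)}{p(\mathbf{x} \mid M_2)}, \qquad
p(\mathbf{x} \mid M) = \int p(\mathbf{x} \mid \theta, M)\, p(\theta \mid M)\, d\theta

\widehat{p}_{HM}(\mathbf{x} \mid M)
  = \left[ \frac{1}{S} \sum_{s=1}^{S}
      \frac{1}{p(\mathbf{x} \mid \theta^{(s)}, M)} \right]^{-1}
```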

Trajectories of prosocial behavior and physical aggression between 6 and 12 years of age were identified for a sample (N=1,025) of males. The trajectories were then used to predict school dropout and physical violence at age 17. Using a group-based semi-parametric method, two trajectories of prosociality (low and moderate declining) and three trajectories of physical aggression (low, moderate, and high declining) were obtained. Only a small minority (3.4%) of the boys were characterized by both high aggression and moderate prosociality. Physical aggression predicted both school dropout and physical violence, but contrary to expectations, prosocial behavior did not have additive or protective effects.

A longitudinal study with a nested preventive intervention was used to test five hypotheses generated from developmental theories of antisocial behavior. The longitudinal study followed 909 boys from their kindergarten year up to 17 years of age. The randomized multimodal preventive intervention targeted a subsample of boys who were rated disruptive by their kindergarten teacher. Semiparametric analyses of developmental trajectories for self-reported physical aggression, vandalism, and theft identified more types of trajectories than expected from recent theoretical models. Also, these trajectories did not confirm theoretical models, which suggest a general increase of antisocial behavior during adolescence. The majority of boys were on either a low-level antisocial behavior trajectory or a declining trajectory. Less than 6% appeared to follow a trajectory of chronic antisocial behavior. Comparisons between disruptive and nondisruptive kindergarten boys confirmed the hypothesis that disruptive preschool children are at higher risk of following trajectories of frequent antisocial behavior. Comparisons between treated and untreated disruptive boys confirmed that an intensive preventive intervention between 7 and 9 years of age, which included parent training and social skills training, could change the long-term developmental trajectories of physical aggression, vandalism, and theft for disruptive kindergarten boys in low socioeconomic areas. The results suggest that trajectories of violent behavior can be deflected by interventions that do not specifically target the physiological deficits that are often hypothesized to be a causal factor. The value of longitudinal-experimental studies from early childhood onward is discussed.

A structural equation modelling (SEM) technique should be used when models include both latent and manifest variables, and require the simultaneous estimation of several regression equations, but that is not always the case in previous studies. In addition, although the comparison of one's model to alternative models is a fundamental practice in data analysis, it is not used as a rule in extant literature. Moreover, in SEM, the estimation and testing methods are based on asymptotic theory and the validity of the parameter estimates and test statistics depends on large samples. However, most research in this area is based on relatively small samples. The focus of this research is to provide an example of how models including both latent and manifest variables should be tested, employing effective, solid, and rigorous methodological procedures. To this end, this study tests a model of the effects of relationship marketing antecedents and mediators on objective performance, from a cross-validation, competing models approach. The empirical work draws on the perceptions of 5,368 firm representatives about their relationships with their counterparts in hotels. SEM results suggest that commitment is the strongest determinant of share of business and mediates the impact of relationship quality on objective performance.
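
For reference, the general form of a structural equation model with both latent and manifest variables, which is what requires the simultaneous estimation of several regression equations, can be written in standard LISREL-type notation (generic form, not the specific relationship-marketing model tested here):

```latex
% Measurement model (manifest indicators of latent variables)
\mathbf{y} = \Lambda_y \boldsymbol{\eta} + \boldsymbol{\varepsilon}, \qquad
\mathbf{x} = \Lambda_x \boldsymbol{\xi} + \boldsymbol{\delta}
% Structural model (simultaneous regressions among latent variables)
\boldsymbol{\eta} = B \boldsymbol{\eta} + \Gamma \boldsymbol{\xi} + \boldsymbol{\zeta}
```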

Self-regulated learning (SRL) is a process that highly fluctuates as students actively deploy their metacognitive and cognitive processes during learning. In this paper, we apply an extension of latent profiling, latent transition analysis (LTA), which investigates the longitudinal development of students' SRL latent class memberships over time. We will briefly review the theoretical foundations of SRL and discuss the value of using LTA to investigate this multidimensional concept. This study is based on college students (n = 75) learning about the human circulatory system while using MetaTutor, an intelligent tutoring system that adaptively supports SRL and targets specific metacognitive SRL processes including judgment of learning (JOL) and content evaluation (CE). Preliminary results identify transitional probabilities of SRL profiles from four distinct events associated with the use of SRL.
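
A generic latent transition model of the kind applied here can be written as follows (illustrative notation, not the MetaTutor-specific specification): δ gives the initial latent status probabilities, τ the status-to-status transition probabilities, and ρ the item-response probabilities within each status.

```latex
% Latent transition model for T measurement occasions, S latent statuses,
% and J observed indicators per occasion.
P(\mathbf{Y} = \mathbf{y}) = \sum_{s_1=1}^{S} \cdots \sum_{s_T=1}^{S}
  \delta_{s_1} \prod_{t=2}^{T} \tau_{s_t \mid s_{t-1}}
  \prod_{t=1}^{T} \prod_{j=1}^{J} \rho_{j,\, y_{jt} \mid s_t}
```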

Introduction: Mild cognitive impairment (MCI) has clinical value in its ability to predict later dementia. A better understanding of cognitive profiles can further help delineate who is most at risk of conversion to dementia. We aimed to (1) examine to what extent the usual MCI subtyping using core criteria corresponds to empirically defined clusters of patients (latent profile analysis [LPA] of continuous neuropsychological data) and (2) compare the two methods of subtyping memory clinic participants in their prediction of conversion to dementia. Methods: Memory clinic participants (MCI, n = 139) and age-matched controls (n = 98) were recruited. Participants had a full cognitive assessment, and results were grouped (1) according to traditional MCI subtypes and (2) using LPA. MCI participants were followed over approximately 2 years after their initial assessment to monitor for conversion to dementia. Results: Groups were well matched for age and education. Controls performed significantly better than MCI participants on all cognitive measures. With the traditional analysis, most MCI participants were in the amnestic multidomain subgroup (46.8%) and this group was most at risk of conversion to dementia (63%). From the LPA, a three-profile solution fit the data best. Profile 3 was the largest group (40.3%), the most cognitively impaired, and most at risk of conversion to dementia (68% of the group). Discussion: LPA provides a useful adjunct in delineating MCI participants most at risk of conversion to dementia and adds confidence to standard categories of clinical inference.

The aim of this article is to develop a measurement model for feelings of police legitimacy among the population of São Paulo. We find that, in the national literature on public attitudes toward legal authorities, survey-based research commonly adopts an operationalist perspective, neglecting the gap between concepts and survey questions. We contribute substantively to this field by developing a measurement model of police legitimacy, conceptually defined as a normatively grounded duty to obey and a moral alignment with the police. Using data from a 2017 survey representative of eight regions of the city of São Paulo, we conducted a second-order confirmatory factor analysis. We then validated this measure through a structural equation model centered on the legitimacy construct. Implications of the concept and measurement of legitimacy for public security are discussed.

The network approach to psychopathology is becoming increasingly popular. The motivation for this approach is to provide a replacement for the problematic common cause perspective and the associated latent variable model, where symptoms are taken to be mere effects of a common cause (the disorder itself). The idea is that the latent variable model is plausible for medical diseases, but unrealistic for mental disorders, which should rather be conceptualized as networks of directly interacting symptoms. We argue that this rationale for the network approach is misguided. Latent variable (or common cause) models are not inherently problematic, and there is not even a clear boundary where network models end and latent variable (or common cause) models begin. We also argue that focusing on this contrast has led to an unrealistic view of testing and finding support for the network approach, as well as an oversimplified picture of the relationship between medical diseases and mental disorders. As an alternative, we point out more essential contrasts, such as the contrast between dynamic and static modeling approaches, that can provide a better framework for conceptualizing mental disorders. Finally, we discuss several topics and open problems that need to be addressed in order to make the network approach more concrete and to move the field of psychological network research forward.

CONTEXT: Deviant peer group involvement is strongly related to onset, aggravation, and persistence of conduct problems during adolescence. OBJECTIVE: To identify early childhood behavioral profiles that predict early-onset deviant peer group involvement.

Objective: To identify profiles of emotional and behavioral symptoms from early childhood to adolescence, their stability across development, and associated factors. Method: Our sample included 17,216 children assessed at ages 3, 5, 7, 11, and 14 from the UK Millennium Cohort Study. We used latent profile and latent transition analysis to study their emotional and behavioral profiles from early childhood to adolescence. We included sociodemographic, family, and parenting variables to study their effects on latent profile membership and transitions. Results: The number and specific profiles of emotional and behavioral symptoms change with the developmental stage. We found a higher number of profiles at ages 3, 5, and 14, which suggests greater heterogeneity in the presentation of emotional and behavioral symptoms in early childhood and adolescence compared to late childhood. There was greater heterotypic continuity between ages 3 and 5, particularly in transitions from higher...

This paper aims at classifying the elderly population on the basis of their disability profile and quantifying the number of those with a very low level of functioning in a central region of Italy. This is accomplished using a set of variables on the difficulty of accomplishing everyday tasks (Activities of Daily Living, ADL) and functions. This issue is