Antonio Carlos Rueda - Academia.edu
Papers by Antonio Carlos Rueda
Il Nuovo Cimento A, 1978
Zero-Point Field. Summary: It is proposed that an agent responsible for the acceleration of cosmic-ray particles, and in particular for the acceleration of cosmic-ray protons, is the zero-point universal background radiation. This proposition is based first on a classical model for polarizable particles interacting with a random electromagnetic field, due to Einstein and Hopf, and second on the conjecture that some cosmic-ray particles, and in particular protons, can be polarized due to their inner structure and respond like linear oscillators. An energy spectrum for cosmic-ray protons is derived, and the braking effect of the microwave background as well as collisions with interstellar or intergalactic gas particles are discussed. Numerical fittings and simple predictions are presented. The analysis is carried out classically in the manner of random or stochastic electrodynamics. 1. Introduction. Ordinary classical electrodynamics assigns a zero value to the potentials and fields that are the homogeneous part of the solution to Maxwell's equations. If instead of this null homogeneous solution we take a homogeneous boundary solution that represents a randomly fluctuating electromagnetic field with a Lorentz-invariant spectrum, we have what is ordinarily called... (*) Research supported by Colciencias.
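For context (this is the standard stochastic-electrodynamics background, not a result stated in this abstract): the Lorentz-invariant spectrum referred to above is the zero-point field spectral density, whose cubic frequency dependence is fixed, up to scale, by Lorentz invariance.

```latex
% Spectral energy density of the Lorentz-invariant zero-point field
% (standard stochastic-electrodynamics form; the \omega^3 dependence
% is the unique Lorentz-invariant choice up to an overall scale):
\[
  \rho(\omega)\,\mathrm{d}\omega \;=\; \frac{\hbar\,\omega^{3}}{2\pi^{2}c^{3}}\,\mathrm{d}\omega
\]
```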
Methods in molecular biology (Clifton, N.J.), 2017
High-throughput sequencing (HTS) data for small RNAs (noncoding RNA molecules that are 20-250 nucleotides in length) can now be routinely generated by minimally equipped wet laboratories; however, the bottleneck in HTS-based research has now shifted to the analysis of such huge amounts of data. One of the reasons is that many analysis types require a Linux environment, but computers, system administrators, and bioinformaticians entail additional costs that often cannot be afforded by small to mid-sized groups or laboratories. Web servers are an alternative that can be used if the data is not subject to privacy issues (which is very often an important concern with medical data). However, in any case they are less flexible than stand-alone programs, limiting the number of workflows and analysis types that can be carried out. We show in this protocol how virtual machines can be used to overcome those problems and limitations. sRNAtoolboxVM is a virtual machine that can be executed on all c...
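The protocol's VM setup details are not reproduced in this summary; as a minimal sketch only, here is how a VirtualBox appliance can be imported and booted from Python. The appliance file and VM name are placeholders; only the VBoxManage import/startvm subcommands are standard.

```python
import subprocess

APPLIANCE = "sRNAtoolboxVM.ova"   # placeholder file name, not the real download
VM_NAME = "sRNAtoolboxVM"         # placeholder VM name

# Import the appliance once, then boot it headless.
subprocess.run(["VBoxManage", "import", APPLIANCE], check=True)
subprocess.run(["VBoxManage", "startvm", VM_NAME, "--type", "headless"],
               check=True)
```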
Periferica, 2012
The authors of this study set out to analyse the status of the cultural dimension of Spanish universities, considering its historical evolution and legal validity. They also present the results of a survey on University Social Responsibility, comparing these with other sources of related information.
Nucleic acids research, Jan 23, 2017
High-profile genomic variation projects like the 1000 Genomes Project or the Exome Aggregation Consortium are generating a wealth of human genomic variation knowledge that can be used as an essential reference for identifying disease-causing genotypes. However, accessing these data, contrasting the various studies and integrating those data in downstream analyses remains cumbersome. The Human Genome Variation Archive (HGVA) tackles these challenges and facilitates access to genomic data for key reference projects in a clean, fast and integrated fashion. HGVA provides an efficient and intuitive web interface for easy data mining, a comprehensive RESTful API and client libraries in Python, Java and JavaScript for fast programmatic access to its knowledge base. HGVA calculates population frequencies for these projects and enriches their data with variant annotation provided by CellBase, a rich and fast annotation solution. HGVA serves as a proof-of-concept of the genome analysis deve...
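The abstract names a RESTful API and client libraries but not the endpoints, so the following Python sketch shows only the generic query pattern. The base URL, path and parameter names are illustrative assumptions, not HGVA's documented interface.

```python
import requests

# Hypothetical endpoint layout -- consult the HGVA documentation for real paths.
BASE_URL = "https://example.org/hgva/webservices/rest"  # placeholder URL

def fetch_variants(region: str) -> dict:
    """Query a (hypothetical) variant endpoint for a genomic region."""
    resp = requests.get(
        f"{BASE_URL}/variants/search",           # assumed path
        params={"region": region, "limit": 10},  # assumed parameters
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # e.g. a small region on chromosome 13
    print(fetch_variants("13:32889611-32890000"))
```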
Nucleic Acids Research, 2015
Small RNA research is a rapidly growing field. Apart from microRNAs, which are important regulators of gene expression, other types of functional small RNA molecules have been reported in animals and plants. MicroRNAs are important in host-microbe interactions, and parasite microRNAs might modulate the innate immunity of the host. Furthermore, small RNAs can be detected in bodily fluids, making them attractive non-invasive biomarker candidates. Given the broad general interest in small RNAs, and in particular microRNAs, a large number of bioinformatics-aided analysis types are needed by the scientific community. To facilitate integrated sRNA research, we developed sRNAtoolbox, a set of independent but interconnected tools for expression profiling from high-throughput sequencing data, consensus differential expression, target gene prediction, visual exploration in a genome context as a function of read length, gene list analysis and BLAST search of unmapped reads. All tools can be used independently or for the exploration and downstream analysis of sRNAbench results. Workflows like the prediction of consensus target genes of parasite microRNAs in the host, followed by the detection of enriched pathways, can be easily established. The web interface interconnecting all these tools is available at http://bioinfo5.ugr.es/srnatoolbox
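As a flavour of the expression-profiling step such a toolbox feeds, here is a minimal sketch of reads-per-million normalisation and a naive fold change on a small-RNA count table. The file name and sample columns are invented for illustration and are not sRNAtoolbox's actual output format.

```python
import numpy as np
import pandas as pd

# Hypothetical count matrix: rows = small RNAs, columns = samples.
counts = pd.read_csv("srna_counts.tsv", sep="\t", index_col=0)  # assumed file

# Reads-per-million normalisation: scale each sample by its library size.
rpm = counts.div(counts.sum(axis=0), axis=1) * 1e6

# Naive log2 fold change between two assumed samples, with a pseudocount.
log2fc = np.log2(rpm["treated"] + 1) - np.log2(rpm["control"] + 1)
print(log2fc.sort_values().head())
```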
European Workshop on Computational Geometry, 2006
Visibility determination becomes especially significant in Computer Graphics for walkthrough applications. The aim is to determine the visible objects of the scene from the viewer position. Even though this problem is nowadays efficiently solved by the graphics hardware, when a real-time response is required some additional CPU processing of the scene geometry may be necessary to...
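The abstract is truncated here; as a generic illustration of CPU-side visibility preprocessing (not this paper's method), below is a conservative sphere-versus-frustum culling test: an object is rejected only if its bounding sphere lies entirely behind one of the frustum planes.

```python
from dataclasses import dataclass

@dataclass
class Plane:
    # Plane n.p + d = 0, with the normal n pointing into the frustum.
    nx: float
    ny: float
    nz: float
    d: float

def sphere_in_frustum(center, radius, planes) -> bool:
    """Conservative CPU visibility test: returns False only when the
    bounding sphere is completely outside one of the frustum planes."""
    x, y, z = center
    for p in planes:
        if p.nx * x + p.ny * y + p.nz * z + p.d < -radius:
            return False  # completely behind this plane
    return True
```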
Acta Oncologica, 2015
Background. Relapsed or refractory (R/R) classical Hodgkin lymphoma (cHL) after autologous stem cell transplantation (ASCT) remains a challenge. For these patients, treatments with mechanisms of action different from classical chemotherapy are needed. Patients and methods. Patients with R/R cHL after ASCT were recruited in a phase II trial (EUDRA CT: 2009-016588-12). Lenalidomide was administered at 20 mg/day for 21 days and cyclophosphamide at 50 mg/day for 28 days (cycles every 28 days). Dose escalation for lenalidomide was permitted. In 2009 we considered that this treatment would be promising if the response rate were over 60%, and a Simon two-stage binomial design was used to calculate the sample size. A total of 46 patients were planned, but the trial would be stopped if fewer than seven responses after four cycles were obtained in the first 16 patients. Results. The trial was closed early because only five responses were observed after four cycles in the first 16 patients included. Median age was 34 years (18-77). The median number of previous lines was five (2-6). At inclusion, 10 patients were primary refractory and 11 refractory to the last therapy. A total of 110 cycles were administered, with grade ≥ 3 toxicity in 43 cycles (39%). One non-neutropenic patient developed septic shock resulting in death. An ORR of 38% (1 CR and 5 PR) was observed and a total of 10 patients (62%) achieved clinical benefit. Median progression-free survival and overall survival were seven and 19 months, respectively. With a median follow-up of 19 months (3-38+), three-year progression-free and overall survival were 6% and 31%, respectively. Conclusion. The optimistic assumptions of this trial led to an early closure. However, the promising clinical benefit observed with the oral combination of lenalidomide and metronomic cyclophosphamide may justify its use for outpatient palliative treatment.
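The stage-one stopping rule quoted above (stop if fewer than seven responses among the first 16 patients) can be sanity-checked with a one-line binomial computation; a sketch using scipy, with the 60% target response rate from the abstract:

```python
from scipy.stats import binom

p_target = 0.60   # response rate considered promising (from the abstract)
n_stage1 = 16     # patients in the first stage
min_resp = 7      # continue only if at least 7 responses are seen

# Probability of (wrongly) stopping early if the true response rate were 60%:
p_stop = binom.cdf(min_resp - 1, n_stage1, p_target)
print(f"P(stop early | p=0.6) = {p_stop:.4f}")
```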
Proceedings of the 44th IEEE Conference on Decision and Control
In this paper an alternative approach to non-linear predictive control is presented. It is based on iterative linearisation of the model response, so that the same closed-loop responses as in the pure non-linear approach are obtained, but with reduced computation times and more efficient optimisation tools. The method is applied to a high-purity distillation column and some results are presented showing the behaviour of the proposed algorithm.
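As a toy illustration of the iterative-linearisation idea only (a scalar one-step tracking problem; the model and all constants are invented and bear no relation to the paper's distillation-column formulation):

```python
import numpy as np

def f(x, u):
    """Toy nonlinear one-step model (invented for illustration)."""
    return x + 0.1 * np.sin(x) + 0.5 * u - 0.05 * u**3

def one_step_control(x, r, u0=0.0, iters=20, tol=1e-10, eps=1e-6):
    """Iteratively linearise f around the current input guess and solve
    the linearised tracking subproblem f(x, u) ~ r for u."""
    u = u0
    for _ in range(iters):
        y = f(x, u)
        dfdu = (f(x, u + eps) - y) / eps   # local linearisation of the model
        du = (r - y) / dfdu                # solve the linearised subproblem
        u += du
        if abs(du) < tol:                  # converged to the nonlinear answer
            break
    return u

x, r = 1.0, 1.4
u = one_step_control(x, r)
print(u, f(x, u))  # f(x, u) should be close to the setpoint r
```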
Computer Graphics Forum, 2014
The generation of a stereoscopic animation film requires doubling the rendering times and hence the cost. In this paper, we address this problem and propose an automatic system for generating a stereo pair from a given image and its depth map. Although several solutions exist in the literature, the high standards of image quality required in the context of a professional animation studio forced us to develop specially crafted algorithms that avoid artefacts caused by occlusions, anti-aliasing filters, etc. This paper describes all the algorithms involved in our system and provides their GPU implementation. The proposed system has been tested in real-life working scenarios. Our experiments show that the second view of the stereoscopic pair can be computed with as little as 15% of the effort of the original image while guaranteeing a similar quality.
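The paper's specially crafted algorithms are not described in this summary; the sketch below shows only the textbook depth-image-based rendering step such systems start from: shift each pixel horizontally by a disparity proportional to inverse depth, resolving occlusions with a z-buffer. The camera parameters are illustrative, and disocclusion holes are left unfilled.

```python
import numpy as np

def render_second_view(image, depth, baseline=0.05, focal=700.0):
    """Naive depth-image-based rendering of the second stereo view.

    image: (H, W, 3) array; depth: (H, W) array of positive depths.
    baseline/focal are illustrative camera parameters (assumptions).
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    zbuf = np.full((h, w), np.inf)
    disparity = np.round(baseline * focal / depth).astype(int)
    for y in range(h):
        for x in range(w):
            xs = x - disparity[y, x]           # shift toward the second eye
            if 0 <= xs < w and depth[y, x] < zbuf[y, xs]:
                zbuf[y, xs] = depth[y, x]      # nearer surface wins (occlusion)
                out[y, xs] = image[y, x]
    return out
```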
Visual Computer, 2005
In this work we describe a decomposition scheme for polyhedra called layer-based decomposition. This decomposition can be computed in a straightforward way for any kind of polyhedron (convex or nonconvex, genus 0 or higher, etc.) and presents interesting properties and applications such as the point-in-polyhedron inclusion test, the computation of Boolean operations, or 3D location. Two methods for computing this decomposition and several of its applications are described in detail, including experimental results and comparisons with alternative approaches.
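For reference, here is the classic ray-crossing parity test that layer-based point-in-polyhedron methods are typically compared against (this is the baseline technique, not the paper's decomposition):

```python
import numpy as np

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection; returns t or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:
        return None                      # ray parallel to the triangle
    inv = 1.0 / det
    s = orig - v0
    u = np.dot(s, p) * inv
    if u < 0 or u > 1:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv
    if v < 0 or u + v > 1:
        return None
    t = np.dot(e2, q) * inv
    return t if t > eps else None

def point_in_polyhedron(point, triangles):
    """Parity test: cast a ray and count crossings (odd = inside).
    Assumes the ray grazes no edges/vertices; robust codes perturb it."""
    direction = np.array([1.0, 0.0, 0.0])
    hits = sum(
        1 for (v0, v1, v2) in triangles
        if ray_triangle(point, direction, v0, v1, v2) is not None
    )
    return hits % 2 == 1
```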
Computational Biology and Chemistry, 2014
Early global measures of genome complexity (power spectra, the analysis of fluctuations in DNA walks or compositional segmentation) uncovered a high degree of complexity in eukaryotic genome sequences. The main evolutionary mechanisms leading to increases in genome complexity (i.e. gene duplication and transposon proliferation) can all potentially produce increases in DNA clustering. To quantify such clustering and provide a genome-wide description of the resulting clusters, we developed GenomeCluster, an algorithm able to detect clusters of any genome element identified by chromosome coordinates. We obtained a detailed description of clusters for ten categories of human genome elements, including functional (genes, exons, introns), regulatory (CpG islands, TFBSs, enhancers), variant (SNPs) and repeat (Alus, LINE1) elements, as well as DNase hypersensitivity sites. For each category, we located their clusters in the human genome, quantified cluster length and composition, and estimated the clustering level as the proportion of clustered genome elements. On average, we found 27% of elements in clusters, although considerable variation occurs among the different categories. Genes form the lowest number of clusters, but these are the longest ones, both in bp and in the average number of components, while the shortest clusters are formed by SNPs. Functional and regulatory elements (genes, CpG islands, TFBSs, enhancers) show the highest clustering level, as compared to DNase sites, repeats (Alus, LINE1) or SNPs. Many of the genome elements we analyzed are known to be composed of clusters of lower-level entities. In addition, we found here that the clusters generated by GenomeCluster can in turn be clustered into high-level super-clusters. The observation of 'clusters-within-clusters' parallels the 'domains within domains' phenomenon previously detected through global statistical methods in eukaryotic sequences, and reveals a complex human genome landscape dominated by hierarchical clustering.
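The abstract does not spell out GenomeCluster's clustering criterion; to make the idea concrete, here is a minimal sketch of one common approach, merging same-chromosome elements whose gap falls below a distance threshold. The threshold is illustrative, not the published criterion.

```python
def cluster_elements(starts_ends, max_gap=1000):
    """Group genome elements (start, end) on one chromosome into clusters:
    an element joins the current cluster if it starts within max_gap bp
    of the cluster's end. max_gap is an illustrative choice."""
    elems = sorted(starts_ends)
    clusters, current = [], [elems[0]]
    for start, end in elems[1:]:
        if start - current[-1][1] <= max_gap:
            current.append((start, end))
        else:
            clusters.append(current)
            current = [(start, end)]
    clusters.append(current)
    return clusters

elems = [(100, 200), (450, 600), (5000, 5100), (5150, 5300)]
for c in cluster_elements(elems):
    print(len(c), c[0][0], c[-1][1])  # size, cluster start, cluster end
```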
Revista del Laboratorio Clínico, 2011
ICM 2000. Proceedings of the 12th International Conference on Microelectronics. (IEEE Cat. No.00EX453), 2000
Partial-product reduction circuits (compressors) are of central importance in the design of high-performance parallel multipliers. This paper proposes compressor designs based on threshold gates, which have been implemented as vMOS circuits. A typical block, a (4,2) compressor, is fully developed. Data for a (6,2) compressor are also provided. Results show that such compressors have the best performance in delay...
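For readers unfamiliar with the block, here is a behavioural model of the standard (4,2) compressor built from two cascaded full adders (the generic logic structure, not the paper's threshold-gate/vMOS circuit), with an exhaustive check of its defining arithmetic identity:

```python
from itertools import product

def full_adder(a, b, c):
    s = a ^ b ^ c
    carry = (a & b) | (a & c) | (b & c)
    return s, carry

def compressor_4_2(x1, x2, x3, x4, cin):
    """(4,2) compressor as two cascaded full adders."""
    s1, cout = full_adder(x1, x2, x3)   # cout leaves the column immediately
    s, carry = full_adder(s1, x4, cin)  # carry feeds the next column
    return s, carry, cout

# Exhaustive check of the defining identity:
#   x1 + x2 + x3 + x4 + cin == sum + 2*(carry + cout)
for bits in product((0, 1), repeat=5):
    s, carry, cout = compressor_4_2(*bits)
    assert sum(bits) == s + 2 * (carry + cout)
print("identity holds for all 32 input combinations")
```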
Proceedings. International Conference on Image Processing
... FUNCTION. M. A. Roula, A. Bouridane, F. Kurugollu and A. Amira. Queen's University of Belfast, School of Computer Science, 18 Malone Road, BT7 1NN, Belfast, UK.
Therapeutics and Clinical Risk Management, 2011
Aggressive T cell lymphomas are a subgroup of lymphomas with a particularly poor prognosis. This is especially true for patients with recurrent or refractory disease, who typically have limited response to salvage therapy and extremely poor overall survival. For this reason, there is a strong need to develop potentially active drugs for these malignancies. Pralatrexate is a novel antifolate designed to have high affinity for reduced folate carrier type 1. Preclinical and clinical studies have demonstrated that pralatrexate has significant activity against T cell lymphomas. The dose-limiting toxicity for pralatrexate is mucositis, which can be abrogated with folic acid and vitamin B12 supplementation. Pralatrexate is the first single agent approved for the treatment of patients with relapsed or refractory peripheral T cell lymphoma. This approval was based on an overall objective response rate observed in the pivotal study. The overall response rate was 29%, with a median duration of 10.1 months. This article reviews the biochemistry, preclinical experience, metabolism, and pharmacokinetics of pralatrexate, including the clinical experience with this agent in lymphoma. Future areas of development are now focused on identifying synergistic combinations of pralatrexate with other agents and the evaluation of predictive markers for clinical benefit.
The Visual Computer, 2007
Voxelization of solids, that is, the representation of a solid by a set of voxels that approximates it, is an operation with important applications in fields like solid modeling, physical simulation or volume graphics. Moreover, the new generation of affordable 3D raster displays has renewed the interest in fast voxelization algorithms, as the scan-conversion of a solid is a basic operation on these devices. In this paper a hardware-accelerated method for computing a voxelization of a polyhedron is presented. The algorithm is simple, efficient, robust and handles any kind of polyhedron (self-intersecting, with or without holes, manifold or non-manifold). Three different implementations are described in detail. The first is a conventional implementation in the CPU, the second is a hardware-accelerated implementation that uses standard OpenGL primitives, and the third exploits the capabilities of modern GPUs by using vertex programs.
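A minimal CPU sketch of the parity principle underlying such voxelizations (reusing the ray_triangle helper from the point-in-polyhedron sketch above; grid resolution and bounds are illustrative choices, and degenerate grazing hits are ignored):

```python
import numpy as np

# Uses ray_triangle() from the point-in-polyhedron sketch above.

def voxelize(triangles, n=32, lo=-1.0, hi=1.0):
    """Parity voxelization on an n^3 grid over the cube [lo, hi]^3:
    cast one ray per (x, y) column, sort the hit depths, and fill the
    alternating inside spans between consecutive hits."""
    grid = np.zeros((n, n, n), dtype=bool)
    centers = lo + (np.arange(n) + 0.5) * (hi - lo) / n
    d = np.array([0.0, 0.0, 1.0])            # ray along +z
    for i, x in enumerate(centers):
        for j, y in enumerate(centers):
            o = np.array([x, y, lo - 1.0])   # start below the grid
            ts = sorted(
                t for (v0, v1, v2) in triangles
                if (t := ray_triangle(o, d, v0, v1, v2)) is not None
            )
            # Each consecutive (enter, exit) pair bounds an inside span.
            for t_in, t_out in zip(ts[0::2], ts[1::2]):
                z0, z1 = o[2] + t_in, o[2] + t_out
                k = (centers >= z0) & (centers <= z1)
                grid[i, j, k] = True
    return grid
```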
The Visual Computer, 2009
This paper presents a scheme for decomposing polyhedra called multi-LREP. The scheme is based on the L-REP decomposition, which classifies the triangular faces of a polyhedron into a set of layered tetrahedra. In the multi-LREP these layered tetrahedra are grouped into regions of a space subdivision. The paper also describes an efficient method for constructing the L-REP decomposition and how the multi-LREP can be applied to speed up two L-REP applications: the point-in-polyhedron inclusion test and the ray-scene intersection. An experimental comparison with other point-in-polyhedron tests is presented as well.
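The multi-LREP's space subdivision is specific to the paper; as a generic illustration of why grouping geometry into spatial regions accelerates queries, here is a minimal uniform-grid bucketing of triangles by their bounding boxes:

```python
import numpy as np
from collections import defaultdict

def build_grid(triangles, n=16, lo=-1.0, hi=1.0):
    """Bucket triangle indices into an n^3 uniform grid by their AABBs,
    so a query only visits candidates in the cells it touches.
    (A generic acceleration structure, not the multi-LREP itself.)"""
    cell = (hi - lo) / n
    grid = defaultdict(list)
    for idx, tri in enumerate(triangles):
        pts = np.asarray(tri)
        cmin = np.clip(((pts.min(axis=0) - lo) / cell).astype(int), 0, n - 1)
        cmax = np.clip(((pts.max(axis=0) - lo) / cell).astype(int), 0, n - 1)
        for i in range(cmin[0], cmax[0] + 1):
            for j in range(cmin[1], cmax[1] + 1):
                for k in range(cmin[2], cmax[2] + 1):
                    grid[(i, j, k)].append(idx)
    return grid
```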
Robotics and Autonomous Systems, 2010
The Path Planning problem is a common topic in Robotics and Computational Geometry. Many important results have been obtained for this classic problem, some of them based on plane or space tessellation. The new approach we propose in this paper computes a partition of the plane called the Polar Diagram, using angle properties as the construction criterion. Compared with other plane partitions such as Voronoi Diagrams, this tessellation can be computed much more efficiently for different geometric objects. The polar diagram, used as a preprocessing structure, can be applied to many geometric problems whose solution can be given by angle processing, such as Visibility or Path Planning problems.
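As a toy illustration of "angle processing" (a brute-force stand-in, simplified from the polar diagram's actual region definition): given a set of sites, return the site from which a query point is seen under the smallest polar angle.

```python
import math

def smallest_polar_angle_site(sites, q):
    """Return the site s minimizing the polar angle of the vector s -> q,
    measured counter-clockwise from the positive x-axis. A brute-force
    stand-in for a polar-diagram point-location query."""
    def angle(s):
        a = math.atan2(q[1] - s[1], q[0] - s[0])
        return a % (2 * math.pi)   # normalize to [0, 2*pi)
    return min(sites, key=angle)

sites = [(0.0, 0.0), (2.0, 1.0), (1.0, 3.0)]
print(smallest_polar_angle_site(sites, (4.0, 2.0)))
```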