JOAO BAPTISTA SANCHES - Academia.edu

Papers by JOAO BAPTISTA SANCHES

Characterization and quantification of cytokines, langerhans cells and factor XIIIa+ dermal dendrocytes in skin lesions from patients with human T-cell lymphotropic virus type 1 (HTLV-1): an immunohistochemical study

Plano de marketing aplicado ao CDI da Secretaria-Geral do MAMAOT

This is an emotional and important moment at this stage of my life. The right words are always hard to find, but, in a simple way, I want to thank everyone who was always present during this project work. In particular I want to thank: Professor Doutor Paulo Alberto, my supervisor, for his support and interest in following this work. My co-supervisor, Mestre Jorge Revez, for having accepted the challenge of taking part in this dissertation, for sharing his knowledge and for his constant support; without his help and knowledge, it would not have been completed. My mother, for her affection, support and understanding. My family in general, who always supported my decisions. The Head of Division, Drª Emília Correia, and the staff of the Divisão da Documentação e Comunicação of the Secretaria-Geral do MAMAOT, for their support, understanding, availability and contribution at crucial moments of this project work. All my friends and master's colleagues, especially António Carrilho, for putting up with all my moments of despair, for his motivating words and for revising the text.

Determinantes Da Divulgação De Informações Voluntárias Por Meio Da Internet

Contexto, Apr 29, 2014

This research aimed to identify the determining factors that explain the volume of voluntary information disclosed by companies on the internet. The research was designed as quantitative, descriptive and documentary. The sample consisted of 128 publicly traded companies. Data were collected between December 2008 and December 2009 and were analysed using ordinary least squares (OLS) linear regression and Pearson correlation analysis. The factors analysed were company size, profitability, leverage, audit-firm size and level of corporate governance. The results show that company size, audit-firm size and corporate governance level were determinants of the volume of voluntary information disclosed by companies on the internet. Profitability and leverage were not statistically significant.
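A minimal sketch of the kind of analysis described above, on synthetic data with hypothetical variable names (a disclosure score regressed on size, profitability, leverage, auditor size and governance level); it is not the authors' dataset or their exact model specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 128  # number of listed firms in the sample

# Hypothetical explanatory variables (standardized synthetic data).
size = rng.normal(size=n)                  # firm size (e.g., log of total assets)
profitability = rng.normal(size=n)
leverage = rng.normal(size=n)
big_auditor = rng.integers(0, 2, size=n)   # 1 = audited by a large firm
governance = rng.integers(0, 3, size=n)    # corporate-governance level

# Synthetic disclosure index, driven mainly by size, auditor and governance.
disclosure = 2.0 + 0.8 * size + 0.5 * big_auditor + 0.4 * governance \
             + rng.normal(scale=0.5, size=n)

# OLS estimation via least squares.
X = np.column_stack([np.ones(n), size, profitability, leverage,
                     big_auditor, governance])
beta, *_ = np.linalg.lstsq(X, disclosure, rcond=None)
print("OLS coefficients:", np.round(beta, 3))

# Pearson correlation between each factor and the disclosure index.
for name, x in [("size", size), ("profitability", profitability),
                ("leverage", leverage), ("auditor", big_auditor),
                ("governance", governance)]:
    r, p = stats.pearsonr(x, disclosure)
    print(f"{name:13s} r = {r:+.3f}  p = {p:.3f}")
```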

A cidade e os conjuntos habitacionais de interesse social: reflexões sobre a evolução do perímetro urbano de Cuiabá/MT

Anais Encontros Nacionais Da Anpur, Dec 1, 2013

In Cuiabá, as in many other Brazilian cities, social-interest housing subdivisions and housing estates were vectors of urban expansion. With their implementation (through programmes to reduce the housing deficit) on land far from the areas with the greatest concentration of jobs, a large share of the low-income population was displaced, and with it public facilities and services, into areas beyond the limits of the urban perimeter. This article aims to reflect on the social and economic consequences that the implementation of these social-interest subdivisions and housing estates outside the urban perimeter imposes on public coffers and on the "benefited" citizens. Among these consequences are some of the difficulties residents face in their daily lives, such as social exclusion, urban violence and deficient public transport. It also aims to present the role of the Master Plan (Plano Diretor) as one of the urban-planning and oversight tools imposed to eliminate this practice of unnecessary expansion. The whole discussion presented below originated from the following question (or concern): did the Master Plan have a real effect on the control of urban expansion, and did it lead to better locations for social-interest subdivisions and housing estates? To support the conclusions drawn from this question, a bibliographic review was carried out on the main thematic axes, social exclusion and the Master Plan; then, using aerial images and on-site visits, the areas where social-interest housing estates were implemented (by the municipal and state governments to reduce the housing deficit) were analysed before and after the implementation of Cuiabá's Strategic Development Master Plan, such as the Pedra 90 subdivision, implemented in the mid-1990s, and other more recently built housing estates. Keywords: social-interest housing estates, social exclusion, urban expansion, strategic development master plan.

Temporal 2D reconstruction of cell nucleus from Fluorescence Confocal Microscopy images with anisotropic filtering

Conference Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2008

Fluorescence Confocal Microscopy (FCM) is nowadays one of the most important tools in biomedicine research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, the FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the Poisson multiplicative noise corrupting the FCM images. The observations are organized in a 3D tensor where each plane is one of the images acquired along the time of a cell using the Fluorescence Loss In Photobleaching (FLIP) technique. The method simultaneously removes the noise by considering different spatial and temporal correlations. This is done by using an anisotropic filter.

2083667 Online System For Liver Disease Classification In Ultrasound

Ultrasound in Medicine & Biology, 2015

Abstract available at http://www.sciencedirect.com/science/article/pii/S0301562914009399

The Papoulis-Gerchberg Algorithm with Unknown Signal Bandwidth

Lecture Notes in Computer Science, 2006

The Papoulis-Gerchberg algorithm has been extensively used to solve the missing data problem in band-limited signals. The interpolation of low-pass signals with this algorithm can be done if the signal bandwidth is known. In practice, the signal bandwidth is unknown and has to be estimated by the user, preventing a fully automatic application of the Papoulis-Gerchberg algorithm. In this paper, we propose a method to automatically find this parameter, avoiding the need for user intervention during the reconstruction process. Experimental results are presented to illustrate the performance of the proposed algorithm.
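A minimal sketch of the classic Papoulis-Gerchberg iteration with a known bandwidth, assuming a 1-D signal with missing samples; the automatic bandwidth estimation that is the contribution of this paper is not reproduced here.

```python
import numpy as np

def papoulis_gerchberg(x_obs, known, bandwidth, n_iter=200):
    """Classic PG iteration: alternate between re-imposing the known
    samples and projecting onto the set of band-limited signals.

    x_obs     : observed signal, with arbitrary values at missing positions
    known     : boolean mask, True where the sample is observed
    bandwidth : number of low-frequency FFT bins (per side) kept at each step
    """
    n = len(x_obs)
    x = np.where(known, x_obs, 0.0)
    for _ in range(n_iter):
        # Band-limiting step: zero out all but the low-frequency components.
        X = np.fft.fft(x)
        mask = np.zeros(n, dtype=bool)
        mask[:bandwidth + 1] = True
        mask[-bandwidth:] = True
        x = np.real(np.fft.ifft(np.where(mask, X, 0.0)))
        # Data-consistency step: re-impose the observed samples.
        x[known] = x_obs[known]
    return x

# Usage: reconstruct a low-pass signal with roughly 40% of its samples missing.
rng = np.random.default_rng(1)
t = np.arange(256)
signal = np.cos(2 * np.pi * 3 * t / 256) + 0.5 * np.sin(2 * np.pi * 5 * t / 256)
known = rng.random(256) > 0.4
recon = papoulis_gerchberg(signal * known, known, bandwidth=8)
print("max reconstruction error:", np.max(np.abs(recon - signal)))
```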

The vitality of pattern recognition and image analysis

Not many decades ago, Pattern Recognition and Image Analysis (PR&IA) addressed simple tasks using shallow models. But things are changing, and quickly. Since then, this highly dynamic discipline has been expanding greatly, helped also by the emergence of newer applications such as robotics, biometrics and multimedia systems. Today, PR&IA tasks run the complete gamut: from preprogrammed work to the stimulating challenge of getting computers to learn as they go. At their most formidable, PR&IA tasks require computers to look, interpret and report back. We are at a transition point where PR&IA is suddenly at the forefront. Progress has come about thanks in part to steady advances in the technologies needed to help machines understand visual data, including machine learning and data mining techniques. The papers included in this special issue provide a snapshot of image analysis and pattern recognition research today. They are the very best of the 6th Iberian Conference on Pattern Recognition and Image Analysis (IbPRIA 2013), held on 5-7 June 2013 in Madeira, Portugal. IbPRIA 2013 attracted 181 papers from 34 different countries. After the reviewing process, 105 papers were accepted for presentation at the conference. A selection of the best-scored papers presented at the conference was invited to submit substantially extended and revised versions to this special issue, and the resulting manuscripts were sent out for full review. The process, including required revisions, was in accordance with the standing editorial policy of Neurocomputing, resulting in the final versions of the ten papers accepted and appearing in this special issue.

Neural physiological modeling towards a hemodynamic response function for fMRI

2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2007

The BOLD signal provided by the functional MRI medical modality measures the ratio of oxy- to deoxyhaemoglobin at each location inside the brain. The detection of activated regions upon the application of an external stimulus, e.g., visual or auditory, is based on the comparison of these ratios in a rest condition (pre-stimulus) and in a stimulated condition (post-stimulus). Therefore, accurate knowledge of the impulse response of the BOLD signal to a neural stimulus in a given region is needed to design robust detectors that discriminate, with a high level of confidence, activated from non-activated regions. Usually, in the literature, the hemodynamic response has been modeled by known functions, e.g., gamma functions, fitting them, or not, to the experimental data. In this paper we present a different approach based on the physiologic behavior of the vascular and neural tissues. Reasonable physiological assumptions about oxygen consumption and vasodilatation processes are used to design a linear model from which a transfer function is derived. The model parameters are estimated by a minimum square error (MSE) fit that adjusts the stimulus response to the observations. Experimental results using real data have shown that the proposed model successfully explains the observations, achieving small fitting errors.
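For context, a common baseline that the paper departs from: modelling the haemodynamic response with a gamma-shaped function and fitting its amplitude to the BOLD observations by least squares. The function shape and parameter values below are illustrative assumptions, not the physiological model proposed in the paper.

```python
import numpy as np
from scipy.stats import gamma

# Illustrative gamma-shaped HRF (shape/scale values are conventional choices,
# not taken from the paper).
t = np.arange(0, 30, 0.5)                  # seconds
hrf = gamma.pdf(t, a=6, scale=1.0)         # peaks around 5-6 s
hrf /= hrf.sum()

# Boxcar stimulus (10 s on / 20 s off at 0.5 s sampling) convolved with the HRF.
stimulus = (np.arange(200) % 60 < 20).astype(float)
expected_bold = np.convolve(stimulus, hrf)[:len(stimulus)]

# Synthetic noisy BOLD observation with unknown amplitude.
rng = np.random.default_rng(0)
bold = 2.3 * expected_bold + rng.normal(scale=0.2, size=len(stimulus))

# Least-squares (MSE) fit of the response amplitude to the observations.
amplitude = np.dot(expected_bold, bold) / np.dot(expected_bold, expected_bold)
residual = bold - amplitude * expected_bold
print(f"estimated amplitude: {amplitude:.2f}, MSE: {np.mean(residual**2):.4f}")
```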

An unified framework for Bayesian denoising for several medical and biological imaging Modalities

Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings, 2007

Multiplicative noise is often present in several medical and biological imaging modalities, such as MRI, Ultrasound, PET/SPECT and Fluorescence Microscopy. Removing the noise while preserving the details is not a trivial task. Bayesian algorithms have been used to tackle this problem; they succeed in this task, but their computational burden grows with the image dimensionality. Therefore, a significant effort has been made to address this tradeoff, i.e., to develop fast and reliable algorithms that remove noise without distorting relevant clinical information. This paper provides a new unified framework for Bayesian denoising of images corrupted with additive and multiplicative noise. This allows dealing with additive white Gaussian noise and with multiplicative noise described by Poisson and Rayleigh distributions, respectively. The proposed algorithm is based on the maximum a posteriori (MAP) criterion, and edge-preserving priors are used to avoid the distortion of the relevant image details. The denoising task is performed by an iterative scheme based on the Sylvester/Lyapunov equation. This approach allows the use of fast and efficient algorithms, developed in the context of control theory, to solve the Sylvester/Lyapunov equation. Experimental results with synthetic and real data attest to the performance of the proposed technique, and competitive results are achieved when comparing with state-of-the-art methods.
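The heavy lifting in this class of algorithms is solving a Sylvester equation A X + X B = Q at each iteration. A minimal illustration with SciPy's solver follows; the matrices here are random placeholders, not the ones arising from the paper's MAP formulation.

```python
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
n = 64                                   # e.g., one image dimension

# Placeholder coefficient matrices; in a denoising algorithm they would be
# built from the data-fidelity and edge-preserving prior terms.
A = np.diag(np.full(n, 2.0)) + rng.normal(scale=0.01, size=(n, n))
B = np.diag(np.full(n, 3.0)) + rng.normal(scale=0.01, size=(n, n))
Q = rng.normal(size=(n, n))

# Solve A X + X B = Q efficiently (Bartels-Stewart algorithm under the hood).
X = solve_sylvester(A, B, Q)
print("residual norm:", np.linalg.norm(A @ X + X @ B - Q))
```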

Diffuse liver disease classification from ultrasound surface characterization, clinical and laboratorial data

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2011

In this work the liver contour is semi-automatically segmented and quantified in order to help the identification and diagnosis of diffuse liver disease. The features extracted from the liver contour are used jointly with clinical and laboratorial data in the staging process. The classification results of a support vector machine, a Bayesian and a k-nearest neighbor classifier are compared. A population of 88 patients at five different stages of diffuse liver disease and a leave-one-out cross-validation strategy are used in the classification process. The best results are obtained using the k-nearest neighbor classifier, with an overall accuracy of 80.68%. The good performance of the proposed method indicates that it can provide reliable additional information for the staging of diffuse liver disease.
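A minimal sketch of the evaluation protocol described above (k-nearest-neighbour classification with leave-one-out cross-validation), run on synthetic stand-in features rather than the clinical dataset; the feature count and class separation are assumptions for illustration only.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the real dataset: 88 patients, a handful of contour,
# clinical and laboratory features, and 5 disease stages (labels 0-4).
rng = np.random.default_rng(0)
n_patients, n_features, n_stages = 88, 6, 5
y = rng.integers(0, n_stages, size=n_patients)
X = rng.normal(size=(n_patients, n_features)) + y[:, None] * 0.8  # roughly separable

# Leave-one-out cross-validation of a k-nearest-neighbour classifier.
knn = KNeighborsClassifier(n_neighbors=3)
scores = cross_val_score(knn, X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.2%}")
```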

Fluorescence microscopy imaging denoising with log-Euclidean priors and photobleaching compensation

2009 16th IEEE International Conference on Image Processing (ICIP), 2009

Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images present a low signal-to-noise ratio and a time intensity decay due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which occurs because the fluorophore permanently loses its ability to fluoresce due to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation due to photobleaching, makes long-term biological observation very difficult. In this paper a denoising algorithm for Poisson data, in which the photobleaching effect is explicitly taken into account, is described. The algorithm is designed in a Bayesian framework where the data fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by the photobleaching. The prior term is conceived with Gibbs priors and log-Euclidean potential functions, suited to the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm. One example with real data is included to illustrate its application.

Image Reconstruction using the Benford Law

2006 International Conference on Image Processing, 2006

The Benford law, also known as the First Digit Law, was observed for the first time in 1881 by Simon Newcomb [6]. Fifty years later the physicist Benford [7] carried out exhaustive experimental tests that confirmed the ...
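For reference, the first-digit distribution that the Benford law predicts, P(d) = log10(1 + 1/d), checked here against synthetic data; this is only an illustration of the law itself, not the reconstruction algorithm described in the paper.

```python
import numpy as np

# Benford's predicted first-digit probabilities: P(d) = log10(1 + 1/d).
digits = np.arange(1, 10)
benford = np.log10(1 + 1 / digits)

# Empirical first-digit frequencies of a synthetic multiplicative process,
# which typically follows Benford's law closely.
rng = np.random.default_rng(0)
data = np.exp(rng.uniform(0, 20, size=100_000))      # spans many decades
first_digit = (data / 10 ** np.floor(np.log10(data))).astype(int)
empirical = np.bincount(first_digit, minlength=10)[1:10] / len(data)

for d, p, q in zip(digits, benford, empirical):
    print(f"digit {d}: Benford {p:.3f}  empirical {q:.3f}")
```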

Automatic HyperParameter Estimation in fMRI

Pattern Recognition and Image Analysis, 2011

Maximum a posteriori (MAP) estimation in the scope of the Bayesian framework is a common criterion used in a large number of estimation and decision problems. In image reconstruction problems, the image to be estimated is typically modeled as a Markov Random Field (MRF) described by a Gibbs distribution. In this case, the Gibbs energy depends on a multiplicative coefficient, called the hyperparameter, that is usually tuned manually [13] on a trial-and-error basis. In this paper we propose an automatic hyperparameter estimation method designed in the scope of functional Magnetic Resonance Imaging (fMRI) to identify activated brain areas based on the Blood Oxygen Level Dependent (BOLD) signal. This problem is formulated as a classical binary detection problem in a Bayesian framework where the estimation and inference steps are joined together. The prior terms, incorporating the a priori physiological knowledge about the HRF, drift and spatial correlation across the brain (using edge-preserving priors), are automatically tuned with the newly proposed method. Results on real and synthetic data are presented and compared against the conventional General Linear Model (GLM) approach.

Compensation of log-compressed images for 3-D ultrasound

Ultrasound in Medicine & Biology, 2003

In this study, a Bayesian approach was used for 3-D reconstruction in the presence of multiplicative noise and nonlinear compression of the ultrasound (US) data. Ultrasound images are often considered as being corrupted by multiplicative noise (speckle). Several statistical models have been developed to represent the US data. However, commercial US equipment performs a nonlinear image compression that reduces the dynamic range of the US signal for visualization purposes. This operation changes the distribution of the image pixels, preventing a straightforward application of those models. In this paper, the nonlinear compression is explicitly modeled and considered in the reconstruction process, where the speckle noise present in the radio-frequency (RF) US data is modeled with a Rayleigh distribution. The results obtained by considering the compression of the US data are then compared with those obtained assuming no compression. It is shown that the estimation performed using the nonlinear log-compression model leads to better results than those obtained with the Rayleigh reconstruction method. The proposed algorithm is tested with synthetic and real data and the results are discussed. Including the compression operation in the image formation model improves the reconstruction, leading to sharper images with enhanced anatomical details.
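A small simulation of the observation model described above, under assumed compression parameters: Rayleigh-distributed envelope data passed through a logarithmic compression of the form y = a·log(x) + b. The values of a and b are placeholders, not those of any particular scanner, and the estimators shown are only meant to illustrate why ignoring the compression biases the Rayleigh parameter estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" tissue reflectivity and the Rayleigh-distributed RF envelope.
sigma_true = 4.0                                   # tissue-dependent parameter
envelope = rng.rayleigh(scale=sigma_true, size=100_000)

# Nonlinear log-compression applied by the scanner for display purposes
# (placeholder gain/offset values).
a, b = 25.0, 10.0
compressed = a * np.log(envelope) + b

# Maximum-likelihood estimate of sigma from the raw RF envelope ...
sigma_ml = np.sqrt(np.mean(envelope ** 2) / 2)

# ... versus the biased estimate obtained by ignoring the compression and
# treating the displayed pixels as if they were Rayleigh distributed.
sigma_naive = np.sqrt(np.mean(compressed ** 2) / 2)

print(f"true sigma {sigma_true:.2f}, ML from RF data {sigma_ml:.2f}, "
      f"naive from compressed data {sigma_naive:.2f}")
```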

A multiscale algorithm for three-dimensional free-hand ultrasound

Ultrasound in Medicine & Biology, 2002

This paper presents a multi-scale algorithm for the reconstruction of human anatomy from a set of ultrasound images. Reconstruction is formulated in a Bayesian framework as an optimization problem with a large number of unknown variables. Human tissues are represented by the interpolation of coefficients associated with the nodes of a 3D cubic grid. The convergence of the Bayesian method is usually slow and initialization dependent. In this paper, a multi-scale approach is proposed to increase the convergence rate of the iterative volume estimation process. A coarse estimate of the volume is first obtained using a cubic grid with a small number of nodes, initialized with a constant value computed from the observed data. The volume estimate is then recursively improved by refining the grid step. Experimental results are provided to show that the multi-scale method achieves faster convergence rates than the single-scale approach, which is the key improvement towards real-time implementations. Experimental results of 3D reconstruction of human anatomy are presented to assess the performance of the algorithm, together with comparisons with the single-scale method.

A Binary Detection Approach for fMRI

NeuroImage, 2009

The detection of activated brain areas based on BOLD-fMRI data may be formulated as a classical binary detection problem. An effective algorithm to solve this problem should be able to estimate the Haemodynamic Response Function (HRF), coping with the Additive White Gaussian Noise (AWGN) corrupting the observations, and should also deal with the low-frequency drift affecting the BOLD signal baseline. In this work, a unified Bayesian approach is proposed whereby the traditional estimation and inference steps are joined together and the activity cluster detection is obtained simultaneously with HRF estimation and drift removal.

A Photobleaching/Photoblinking analytical model for LSFCM imaging

Microscopy and Microanalysis, 2013

Laser scanning fluorescence confocal microscope (LSFCM) imaging is an extensively used modality in biological research. However, these images present a low signal-to-noise ratio and a time intensity decay effect due to the so-called photoblinking/photobleaching (PBPB) phenomenon, which corresponds to an intensity fading of a fluorescent probe over time, as shown in Figure 1. This effect is caused by quantum phenomena associated with the electronic excitation and by photochemical reactions among the fluorescent and surrounding molecules, induced by the incident radiation, that temporarily or irreversibly destroy their ability to fluoresce. Since illumination is needed to excite and observe the tagging fluorescent proteins in the specimen, and all the fluorophores will eventually photobleach upon extended excitation, the acquisition of this type of image becomes a hard task for long exposures. There are in the literature several proposals to model these fading effects, and among them the...

Photobleaching/Photoblinking Differential Equation Model for Fluorescence Microscopy Imaging

Microscopy and Microanalysis, 2013

Fluorescence images present a low signal-to-noise ratio (SNR), are corrupted by a type of multiplicative noise with Poisson distribution, and are affected by a time intensity decay due to photoblinking and photobleaching (PBPB) effects. The noise and the PBPB effects together make long-term biological observation very difficult. Here, a theoretical model based on the underlying quantum-mechanical physics of the observation process associated with this type of image is presented, and the common empirical weighted sum of two decaying exponentials is derived from the model. The improvement in SNR obtained in denoising when the proposed method is used is particularly important in the last images of the sequence, where temporal correlation is used to recover information that is sometimes faded and therefore useless from a visual inspection point of view. The proposed PBPB model is included in a Bayesian denoising algorithm previously proposed by the authors. Experiments with synthetic a...
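The "weighted sum of two decaying exponentials" mentioned above can be fitted to an intensity-versus-time curve as follows. The decay rates and weights here are synthetic placeholders, not values derived from the paper's physical model.

```python
import numpy as np
from scipy.optimize import curve_fit

def pbpb_decay(t, a1, tau1, a2, tau2):
    """Weighted sum of two decaying exponentials (empirical PBPB model)."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

# Synthetic mean-intensity curve of an image sequence (placeholder parameters).
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 120)                       # acquisition time, seconds
truth = pbpb_decay(t, a1=0.7, tau1=4.0, a2=0.3, tau2=30.0)
observed = truth + rng.normal(scale=0.01, size=t.size)

# Least-squares fit of the two-exponential model.
p0 = (0.5, 1.0, 0.5, 10.0)                        # rough initial guess
params, _ = curve_fit(pbpb_decay, t, observed, p0=p0)
print("fitted (a1, tau1, a2, tau2):", np.round(params, 2))
```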

Convex Total Variation Denoising of Poisson Fluorescence Confocal Images With Anisotropic Filtering

IEEE Transactions on Image Processing, 2011

Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedicine research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the Poisson intensity-dependent noise corrupting FCM image sequences. The observations are organized in a 3-D tensor where each plane is one of the images acquired along the time of a cell nucleus using the fluorescence loss in photobleaching (FLIP) technique. The method simultaneously removes the noise by considering different spatial and temporal correlations. This is accomplished by using an anisotropic 3-D filter that may be separately tuned in the space and time dimensions. Tests using synthetic and real data are described and presented to illustrate the application of the algorithm. A comparison with several state-of-the-art algorithms is also presented.
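To illustrate only the idea of an anisotropic 3-D filter tuned separately in space and in time (not the convex total-variation Bayesian algorithm proposed in the paper), here is a Gaussian smoothing of a synthetic FLIP-like sequence with independent temporal and spatial widths; the sequence, decay rate and sigma values are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic FLIP-like sequence: a bright disc whose intensity fades over time,
# observed through Poisson noise.
rng = np.random.default_rng(0)
T, H, W = 32, 64, 64
yy, xx = np.mgrid[0:H, 0:W]
disc = ((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2).astype(float)
clean = 50 * disc[None] * np.exp(-np.arange(T) / 20.0)[:, None, None] + 5
noisy = rng.poisson(clean).astype(float)

# Anisotropic smoothing: sigma_t along the time axis, sigma_s in space,
# tuned independently (placeholder values).
sigma_t, sigma_s = 2.0, 1.0
denoised = gaussian_filter(noisy, sigma=(sigma_t, sigma_s, sigma_s))

mse_before = np.mean((noisy - clean) ** 2)
mse_after = np.mean((denoised - clean) ** 2)
print(f"MSE before {mse_before:.1f}, after {mse_after:.1f}")
```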
