Mario Jino - Academia.edu
Papers by Mario Jino
Anais do X Simpósio Brasileiro de Segurança da Informação e de Sistemas Computacionais (SBSeg 2010)
Malicious code (malware) disseminated over the Internet, such as viruses, worms, and trojans, is currently the greatest threat to information security and a lucrative business for criminals. There are approaches to malware analysis that monitor its actions during execution in a controlled environment, making it possible to identify malicious behavior. This paper proposes a non-intrusive tool for behavioral malware analysis, which extends the analysis to samples that evade current approaches and fixes some of their shortcomings, thus filling a gap in the field.
Anais do XIII Simpósio Brasileiro de Segurança da Informação e de Sistemas Computacionais (SBSeg 2013)
Nowadays, software products are developed with security vulnerabilities due to bad coding. Vulnerability scanner tools automatically detect security vulnerabilities in web applications; thus, trustworthiness of the results of these tools is essential, yet the evaluation of their results is sometimes done manually or even empirically. This work presents a semi-automated approach, based on fault injection techniques, to assess the efficacy of these tools. Three scanner tools were assessed in the presence of realistic software faults responsible for security vulnerabilities in web applications. Results show that the approach is effective and has the advantage of predicting security vulnerabilities through the fault injection techniques.
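To make the fault-injection idea concrete, here is a minimal sketch, assuming a purely illustrative fault operator and plain-dict scanner reports rather than the paper's actual tooling: a realistic coding fault is seeded into application code, and the scanner's findings are scored against the set of seeded vulnerabilities.

```python
# Illustrative sketch (not the paper's tool): seed a realistic fault into
# application source, then score a scanner's report against the seeded faults.

def inject_missing_sanitization(source: str) -> str:
    """Hypothetical fault operator: drop calls to a sanitize() wrapper (illustrative only)."""
    return source.replace("sanitize(user_input)", "user_input")

def detection_rate(seeded: set, reported: set) -> float:
    """Fraction of seeded vulnerabilities that the scanner's report actually flags."""
    return len(seeded & reported) / len(seeded) if seeded else 0.0

if __name__ == "__main__":
    original = 'query = "SELECT * FROM users WHERE name = " + sanitize(user_input)'
    print(inject_missing_sanitization(original))  # vulnerable variant handed to the application under test

    # Suppose three faults were seeded and the scanner under evaluation found two of them.
    seeded = {"sqli_login_form", "xss_search_box", "sqli_profile_page"}
    found = {"sqli_login_form", "xss_search_box"}
    print(f"detection rate: {detection_rate(seeded, found):.2f}")  # 0.67
```

The detection rate over many seeded faults is one simple way to compare the efficacy of different scanners.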
This work presents the main features of a tool supporting structural regression testing based on the Potential-Uses family of criteria, together with a strategy for its application. The tool, named RePoKe-Tool (Regression Testing support for Potential-Uses Criteria Tool), can be applied to units (e.g., functions, procedures) that were tested with the POKE-TOOL testing tool and then underwent corrective maintenance (corrective regression testing) or adaptive or perfective maintenance (progressive regression testing). It uses a selective strategy to identify only the elements required for structural testing (paths and associations) that were inserted or modified after the maintenance, and it suggests a subset of the original test cases that potentially covers those elements. It also configures and updates the files needed for POKE-TOOL to perform the regression testing. The work also presents case studies showing the feasibility of applying a regression test case selection strategy based on functional testing concepts, with the goal of achieving a good coverage rate of the required elements selected for structural regression testing. The results obtained from these case studies contributed to the definition of a Reference Guide for Maintenance Programmers, whose goal is to reduce the high costs involved in regression testing without relegating to second place the main goal of any testing activity: revealing defects in the software.
A tool and techniques are presented for test data generation and infeasibility identification in structural software testing. The tool is based on: the Dynamic Technique; search using Genetic Algorithms; and reuse of solutions through Case-Based ...
Software: Practice and Experience, 2017
Nowadays, personal information is collected, stored, and managed through web applications and services. Companies are interested in keeping such information private due to regulatory requirements and the privacy concerns of customers. The reputation of a company can also depend on privacy protection: the more a company protects the privacy of its customers, the more credibility it gains. This paper proposes an integrated approach that relies on models and design tools to support the analysis, design, and development of web applications and services with privacy concerns. Using the approach, these applications can be developed consistently with their privacy policies in order to enforce them, protecting personal information from different sources of privacy violation. The approach is composed of a conceptual model, a reference architecture, and a UML profile, i.e., an extension of UML for including privacy protection. The idea is to systematize privacy concepts in the scope of web applications and services, organizing privacy domain knowledge and providing the features and functionalities that must be addressed to protect users' privacy in the design and development of web applications. Validation has been performed by analyzing the ability of the approach to model privacy policies from real web applications, and by applying it to a simple example of an online bookstore. Results show that privacy protection can be implemented in a model-based approach, bringing value to the stakeholders and constituting an important contribution towards improving the process of designing web applications in the privacy domain.
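As a rough illustration of the kind of rule such a privacy conceptual model can make enforceable (the class and attribute names below are hypothetical and far simpler than the paper's conceptual model and UML profile), the sketch ties personal data items to declared purposes and allows an access only when the user has consented to that purpose.

```python
# Toy privacy-policy check, only to illustrate the kind of concepts a privacy
# conceptual model ties together; all names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PersonalData:
    name: str                                             # e.g. "email address"
    allowed_purposes: set = field(default_factory=set)    # purposes declared in the policy

@dataclass
class Consent:
    user_id: str
    purposes: set = field(default_factory=set)            # purposes the user agreed to

def access_allowed(data: PersonalData, consent: Consent, purpose: str) -> bool:
    """Access is compliant only if the purpose is both declared and consented to."""
    return purpose in data.allowed_purposes and purpose in consent.purposes

if __name__ == "__main__":
    email = PersonalData("email address", {"order_confirmation", "marketing"})
    consent = Consent("user-42", {"order_confirmation"})
    print(access_allowed(email, consent, "order_confirmation"))  # True
    print(access_allowed(email, consent, "marketing"))           # False: no consent given
```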
16th International Conference on Information Technology-New Generations (ITNG 2019), 2019
The Internet has become the main communication instrument among criminals. Expressions used by criminals are ciphered by replacing common language terms with regionalized and mutating expressions. There is a need to reveal, understand, and formalize these obscure dialects to enable the automation of searches and the analysis of intentions. OntoCexp (Ontology of Criminal Expressions) aims at providing a common and extensible model for identifying the usage of crime expressions on the Internet. Its foundations come from an initial terminology and a semantic analysis of written communication between criminals (from Twitter) in Brazil (Portuguese language). Out of 63 articles of interest, 17 papers on ontologies were selected and used as input to our proposal. The initial version of OntoCexp and its core elements are presented here; the complete ontology (OWL file) is publicly available. We expect it to be useful for cyber-security researchers and criminal investigators who wish to formal...
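For readers unfamiliar with how such an ontology is typically encoded, the rdflib sketch below shows the general shape of mapping slang expressions to the criminal concepts they denote; the namespace, class names, and the example term are illustrative and are not OntoCexp's actual vocabulary.

```python
# Minimal OWL/RDF encoding of a slang-to-concept mapping with rdflib; the
# namespace and terms are invented for illustration, not OntoCexp's vocabulary.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/crime-expressions#")  # illustrative namespace

g = Graph()
g.bind("ex", EX)

# Classes and a property: slang expressions denote the criminal concept they refer to.
g.add((EX.CriminalConcept, RDF.type, OWL.Class))
g.add((EX.SlangExpression, RDF.type, OWL.Class))
g.add((EX.denotes, RDF.type, OWL.ObjectProperty))
g.add((EX.denotes, RDFS.domain, EX.SlangExpression))
g.add((EX.denotes, RDFS.range, EX.CriminalConcept))

# One illustrative instance mapping a coded term to the concept it stands for.
g.add((EX.Firearm, RDF.type, EX.CriminalConcept))
g.add((EX.expression_1, RDF.type, EX.SlangExpression))
g.add((EX.expression_1, RDFS.label, Literal("exemplo de gíria", lang="pt")))
g.add((EX.expression_1, EX.denotes, EX.Firearm))

print(g.serialize(format="turtle"))
```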
Ibero-American Symposium on Software Engineering and Knowledge Engineering, 2008
Software testing must be well documented and based on up-to-date and consistent information to avoid problems such as incomplete or inappropriate tests. This involves defining what must be recorded, as well as tracing the diverse links among the many information elements that compose the documents. Documenting and tracing involve handling a huge amount of data, demanding automated support. This paper presents PROMETEU, a tool to support document generation and traceability among the artifacts that comprise the test process documents.
2008 The Third International Conference on Software Engineering Advances, 2008
... The relevance of reactive capabilities as a unifying paradigm for handling a number of database features and applications is well-established. ... The set of manipulation commands in Rx that can cause persistent data flow of interaction associations was targeted ...
IEEE Transactions on Computers, 1979
We are concerned here with the design of intelligent magnetic bubble memories which may be used to support high-level data base management functions. They may also be used to provide users with large work spaces in which elementary file processing operations may be performed without external intervention. It is our intent to explore ways of incorporating the novel chip organizations and unique features of magnetic bubble memories in the design of these memories. In particular, we evaluate the performance of various memory organizations and storage organizations for different magnetic bubble chips. Retrieval times per word and per page are the parameters used to evaluate the different memory organizations. The performance of hierarchical bubble memory systems is discussed. The data rearrangement operations that can be carried out in bubble memories are incorporated in the design of elementary file processing operations and basic relational algebraic operations. An architecture of an intelligent magnetic bubble memory designed to support the relational data model and to enhance a relational algebraic interface is described. Index Terms: database machines, magnetic bubble memories, relational database interface.
XIV Simpósio Brasileiro …, 2000
Abstract: This text presents some software testing criteria, pertinent concepts, and supporting tools. Functional testing criteria, structural criteria based on control flow and on data flow, and mutation-based criteria are covered. Emphasis is given to the testing ...
A literature survey on ontologies concerning the Security Assessment domain has been carried out to uncover initiatives that aim at formalizing concepts from the Security Assessment field of research. A preliminary analysis and a discussion on the selected works are presented. Our main contribution is an updated literature review, describing key characteristics, results, research issues, and application domains of the papers. We have also detected gaps in the Security Assessment literature that could be the subject of further studies in the field. This work is meant to be useful for security researchers who wish to adopt a formal approach in their methods.
This paper describes the deployment of a software testing process at NeoIT, a small software development company. The deployment was carried out by instantiating a generic testing process, taking into account NeoIT's particularities and needs. The deployed testing process, which was used in two development projects, provided a substantial improvement in the quality of the products developed by the company.
2021 IEEE/ACS 18th International Conference on Computer Systems and Applications (AICCSA)
IEEE Data Engineering Bulletin, 1978
Data Base Engineering Bulletin is a quarterly publication of the IEEE Computer Society Technical Committee on Data Base Engineering. Its scope of interest includes: data structures and models, access strategies, access control techniques, data base architecture, data base machines, intelligent front ends, mass storage for very large data bases, distributed data base problems and techniques, data base software design and implementation, data base utilities, etc. Contribution to the Bulletin is hereby solicited. News items, letters, technical papers, book reviews, meeting previews and summaries, etc., should be sent to the Editor. All letters to the Editor will be considered for publication unless accompanied by a request to the contrary. Technical papers are unrefereed. Opinions expressed in contributions are those of the individual author rather than the official position of the TC on Data Base Engineering, the IEEE Computer Society, or organizations with which the author may be affiliated. Membership in Data Base Engineering Technical Committee is open to IEEE Computer Society members, student members, and associate members.
We present a new test data generation technique which uses the concept of diversity of test sets as a basis for diversity-oriented test data generation (DOTG). Using DOTG, we translate into an automatic test data generation technique the intuitive belief that increasing the variety, or diversity, of the test data used to test a program can lead to an improvement in the completeness, or quality, of the testing performed. We define the input domain perspective for diversity (DOTG-ID), which considers the distances among the test data in the program input domain to compute a diversity value for test sets. We describe metaheuristics which can be used to automate the generation of test sets for the DOTG-ID testing technique: simulated annealing; a genetic algorithm; and a proposed metaheuristic named simulated repulsion. The effectiveness of DOTG-ID was evaluated by using a Monte Carlo simulation, and also by applying the technique to test simple programs and measuring ...
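A simplified sketch of the input-domain diversity idea follows; the exact DOTG-ID metric and the simulated repulsion metaheuristic are defined in the paper, while this version merely scores a test set by its average pairwise Euclidean distance and keeps the most diverse of a series of random candidate sets.

```python
# Simplified illustration of input-domain diversity for test sets: score a set
# of numeric test inputs by average pairwise Euclidean distance and prefer the
# more diverse of randomly generated candidates. Not the paper's exact metric.
import math
import random
from itertools import combinations

def diversity(test_set: list) -> float:
    """Average pairwise Euclidean distance among test data in the input domain."""
    pairs = list(combinations(test_set, 2))
    if not pairs:
        return 0.0
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

def random_test_set(size: int, dims: int, lo: float, hi: float) -> list:
    """One candidate test set: 'size' points drawn uniformly from the input domain."""
    return [tuple(random.uniform(lo, hi) for _ in range(dims)) for _ in range(size)]

if __name__ == "__main__":
    random.seed(0)
    best = random_test_set(size=10, dims=2, lo=0.0, hi=100.0)
    for _ in range(200):  # crude random-restart search; the paper uses metaheuristics instead
        candidate = random_test_set(10, 2, 0.0, 100.0)
        if diversity(candidate) > diversity(best):
            best = candidate
    print(f"best diversity found: {diversity(best):.2f}")
```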
Advances in Intelligent Systems and Computing
Virtual Local Area Network (VLAN) is a technology capable of separating networks into specific domains. Attacks on VLANs can affect computing environments, causing service interruptions. These attacks exploit vulnerabilities and operating characteristics of VLANs to gain access to critical information. Conceptual modeling of vulnerabilities and attacks related to VLANs is crucial to enable the construction of systematic methods and techniques for protecting critical infrastructures. Ontologies can contribute in this context, as they are modeling tools that enable the formalization of the main concepts and their relationships, in addition to enabling the creation of semantic rules that can be used by intelligent systems. We present a quasi-systematic literature review aimed at describing and classifying studies on ontologies of vulnerabilities and attacks on VLANs. The approach used in this review allowed for the verification and analysis of trends and uncovered the technological approaches adopted over the past 10 years. The main contributions of this review are: i) a description of the most recent ontologies, taxonomies, techniques and theories, in addition to the contributions and limitations of proposals in the literature; and ii) the identification of gaps in the literature and research challenges. Searches were carried ...
Anais do XI Simpósio Brasileiro de Qualidade de Software (SBQS 2012)
Software process improvement (SPI) has been used successfully in software-intensive organizations, generally based on a single capability or capability-maturity model, such as CMMI-DEV or MR MPS.BR. However, the need to use multiple reference models for process improvement has been growing. Current solutions generally involve the prior creation of a new model that integrates a given set of models. This paper presents a Process Capability Profile Metamodel, which uses MDE concepts for the dynamic integration of elements from multiple models. It also presents the rationale, specification, implementation strategy, and an example of use of the metamodel.
ArXiv, 2017
The use of ontologies and taxonomies contributes by providing means to define concepts, minimize ambiguity, improve interoperability, and manage knowledge of the security domain. Thus, this paper presents a literature survey on ontologies and taxonomies concerning the Security Assessment domain. We carried it out to uncover initiatives that aim at formalizing concepts from the Information Security and Test and Assessment fields of research. We applied a systematic review approach in seven scientific databases. 138 papers were identified and divided into categories according to their main contributions, namely: Ontology, Taxonomy, and Survey. Based on their contents, we selected 47 papers on ontologies, 22 papers on taxonomies, and 11 papers on surveys. A taxonomy has been devised to be used in the evaluation of the papers. Summaries, tables, and a preliminary analysis of the selected works are presented. Our main contributions are: 1) an updated literature review, describing k...
The sharp growth of the Internet and the consequent demand for new services lead to the development of increasingly complex applications. Due to this complexity and the pressure to meet ever tighter schedules, the security of these systems often ends up being neglected, giving rise to applications with more vulnerabilities and therefore subject to a greater number of attacks. One of the most frequent attacks is the buffer overflow. It allows an attacker to insert malicious code into a program, altering its original control flow, with the goal of gaining access to a system or escalating privileges. In this work, the proposed technique called Security Vulnerability Testing was used to detect buffer overflow vulnerabilities in software through the application of Mutation Testing. A tool named SEVMUT (Security Vulnerabilities Mutation Tool) was developed, and its applicability, scalability, and efficacy were validated.
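As an illustration of a vulnerability-introducing mutation operator in this spirit (not necessarily one of SEVMUT's actual operators), the sketch below rewrites bounded C string copies as unbounded ones; a test set that kills such mutants demonstrates its ability to expose buffer overflow vulnerabilities.

```python
# Sketch of one plausible vulnerability-introducing mutation operator (not
# necessarily SEVMUT's): rewrite bounded C string copies as unbounded ones,
# so tests that kill the mutant demonstrate buffer-overflow detection.
import re

BOUNDED_COPY = re.compile(r"strncpy\(\s*([^,]+),\s*([^,]+),\s*[^)]+\)")

def mutate_bounded_copies(c_source: str):
    """Return the mutated source and the number of mutation sites rewritten."""
    mutant, count = BOUNDED_COPY.subn(r"strcpy(\1, \2)", c_source)
    return mutant, count

if __name__ == "__main__":
    original = (
        "void greet(const char *name) {\n"
        "    char buf[16];\n"
        "    strncpy(buf, name, 15);\n"
        "    buf[15] = '\\0';\n"
        "}\n"
    )
    mutant, sites = mutate_bounded_copies(original)
    print(f"{sites} mutation site(s) rewritten")  # 1
    print(mutant)  # strncpy(buf, name, 15) becomes strcpy(buf, name)
```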