Gabriel Le Breton | Université du Québec à Chicoutimi
Papers by Gabriel Le Breton
2014 19th International Conference on Engineering of Complex Computer Systems, 2014
A Navigation State Machine (NSM) is a conceptual map of all possible page sequences in a web application that can be used to statically verify navigation properties. The automated extraction of an NSM from a running application is currently an open problem, as the output of existing web crawlers is not appropriate for model checking. This paper presents SiteHopper, a crawler that computes on-the-fly an abstraction of the NSM based on link and page contents. Experiments show that verification is sped up by many ...
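The abstract above describes an NSM as a map of all possible page sequences on which navigation properties can be checked statically. The following minimal sketch illustrates the idea only; the page names and the specific property are hypothetical and are not taken from SiteHopper. It models an NSM as a directed graph and checks one navigation property: every page can reach the logout page.

```python
from collections import deque

# A toy Navigation State Machine: each node is a page, each edge a link.
# Page names are illustrative, not SiteHopper's actual output.
nsm = {
    "login":    ["home"],
    "home":     ["profile", "cart", "logout"],
    "profile":  ["home", "logout"],
    "cart":     ["checkout", "home"],
    "checkout": ["home", "logout"],
    "logout":   [],
}

def reaches(graph, start, target):
    """Breadth-first search: can `target` be reached from `start`?"""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        if page == target:
            return True
        for nxt in graph.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Navigation property: every page (except logout itself) can reach logout.
violations = [p for p in nsm if p != "logout" and not reaches(nsm, p, "logout")]
```

On this toy graph the property holds and `violations` is empty; removing the `"logout"` edge from `"profile"` and `"checkout"` would make it fail, which is the kind of defect such a static check is meant to surface.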
WebMole is a browser-based tool that automatically and exhaustively explores all pages inside a web application. Contrary to classical web crawlers, which only explore pages accessible through regular anchors, WebMole can find its way through Ajax applications that use JavaScript-triggered links, and handles state changes that do not involve a page reload. User-defined functions called oracles can be used to bound the range of pages explored by WebMole to specific parts of an application, as well as to evaluate Boolean test conditions on all visited pages. Overall, WebMole can prove to be a more flexible alternative to automated testing suites such as Selenium WebDriver.
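The two oracle roles described above (bounding exploration, and evaluating a Boolean test condition on every visited page) can be sketched as follows. This is a conceptual Python sketch, not WebMole's actual API, which runs inside the browser in JavaScript; the site contents, URLs, and oracle conditions are all illustrative.

```python
# A fake site mapping URL -> page contents, standing in for live pages.
SITE = {
    "/app/home":     {"title": "Home", "links": ["/app/about", "/admin/secret"]},
    "/app/about":    {"title": "",     "links": ["/app/home"]},
    "/admin/secret": {"title": "Admin", "links": []},
}

def in_scope_oracle(url):
    """Bounding oracle: restrict exploration to the /app/ section."""
    return url.startswith("/app/")

def title_oracle(page):
    """Test oracle: every visited page must have a non-empty title."""
    return page["title"] != ""

def crawl(start, in_scope, check):
    """Explore all reachable in-scope pages; collect test-oracle failures."""
    visited, stack, failures = set(), [start], []
    while stack:
        url = stack.pop()
        if url in visited or not in_scope(url):
            continue
        visited.add(url)
        page = SITE[url]
        if not check(page):
            failures.append(url)
        stack.extend(page["links"])
    return visited, failures

visited, failures = crawl("/app/home", in_scope_oracle, title_oracle)
```

Here the bounding oracle keeps the crawl out of `/admin/`, while the test oracle flags `/app/about` for its empty title; both checks run as pages are visited rather than after the fact.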
2014 IEEE Seventh International Conference on Software Testing, Verification and Validation Workshops, 2014
Web crawling is the process of exhaustively exploring the contents of a web site or application through automated means. While the results of such a crawl can be put to numerous uses, ranging from a simple backup to comprehensive testing and analysis, features of modern-day applications prevent crawlers from properly exploring applications. We provide an in-depth analysis of 15 such features, and report on their presence in a study of 16 real-world web sites. Based on that study, we develop a configurable web application where the presence of each such feature can be turned on or off, aimed as a test bench where existing crawlers can be compared in a uniform way. Our results, which are the first exhaustive comparison of available crawlers, indicate areas where future work should be aimed.
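The test-bench idea above, a web application where each crawler-hostile feature can be toggled independently, can be sketched as a feature-flagged page generator. The feature names below are illustrative stand-ins (the paper's actual list has 15 features), and the generator is a minimal sketch, not the paper's implementation.

```python
# Hypothetical feature toggles; each one changes how the page is served.
FEATURES = {
    "ajax_links": True,   # links triggered by JavaScript instead of <a href>
    "redirects": False,   # navigation that passes through an HTTP redirect
}

def render_page(features):
    """Emit a page variant that exercises only the enabled features."""
    parts = ["<html><body>"]
    if features["ajax_links"]:
        # A crawler that only follows <a> anchors will miss this link.
        parts.append("<span onclick=\"load('next')\">next</span>")
    else:
        parts.append('<a href="next.html">next</a>')
    if features["redirects"]:
        parts.append("<!-- server issues a 302 before serving this page -->")
    parts.append("</body></html>")
    return "\n".join(parts)
```

Running every candidate crawler against each single-feature configuration of such a generator isolates which feature defeats which crawler, which is what makes the comparison uniform.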
Web Services and Formal Methods
A Navigation State Machine (NSM) is a conceptual map of all possible page sequences in a web application that can be used to statically verify navigation properties. The automated extraction of an NSM from a running application is currently an open problem, as the output of existing web crawlers is not appropriate for model checking. This paper presents SiteHopper, a crawler that computes on-the-fly an abstraction of the NSM based on link and page contents. Experiments show that verification is sped up by many ...
The paper focuses on bugs in web applications that can be detected by analyzing the contents and layout of page elements inside a browser’s window. Based on an empirical analysis of 35 real-world web sites and applications (such as Facebook, Dropbox, and Moodle), it provides a survey and classification of more than 90 instances of layout-based bugs. It then introduces Cornipickle, an automated testing tool that provides a declarative language to express desirable properties of a web application as a set of human-readable assertions on the page’s HTML and CSS data. Such properties can be verified on-the-fly as a user interacts with an application.
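A layout-based property of the kind described above can be illustrated with a bounding-box check. The sketch below is plain Python rather than Cornipickle's actual declarative language, and the element snapshot is invented; it only shows the flavor of an assertion over a page's HTML and CSS data, here "no two buttons overlap on screen".

```python
# Simplified snapshot of two page elements with their CSS geometry.
# Element ids and coordinates are illustrative, not from a real page.
elements = [
    {"tag": "button", "id": "save",   "left": 10, "top": 100, "width": 80, "height": 30},
    {"tag": "button", "id": "cancel", "left": 70, "top": 100, "width": 80, "height": 30},
]

def overlaps(a, b):
    """True if the bounding boxes of two elements intersect."""
    return (a["left"] < b["left"] + b["width"] and b["left"] < a["left"] + a["width"]
            and a["top"] < b["top"] + b["height"] and b["top"] < a["top"] + a["height"])

# Layout property: no two buttons may overlap.
buttons = [e for e in elements if e["tag"] == "button"]
violations = [(a["id"], b["id"]) for i, a in enumerate(buttons)
              for b in buttons[i + 1:] if overlaps(a, b)]
```

In this snapshot the two buttons do overlap, so the property fails; evaluating such predicates against a live snapshot of the DOM after each user action is what "verified on-the-fly" amounts to.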