Ramneek Paul Singh | University of Ottawa | Université d'Ottawa
Papers by Ramneek Paul Singh
Syme, G., Hatton MacDonald, D., Fulton, B. and Piantadosi, J. (eds) MODSIM2017, 22nd International Congress on Modelling and Simulation, 2017
NetCDF (Network Common Data Form) is a data format commonly used to store scientific array-oriented data. A number of software tools exist that can be used to process and view NetCDF data. In some cases, though, the existing tools cannot do the processing required, and we have to write code for some of the data processing operations. One such case is when we have to merge two NetCDF files. Although the core library for read-write access to NetCDF files is written in C, interfaces to the C library are available in a number of languages including Python, C#, Perl, R and others. Python with the Numpy package is widely used in scientific computing, where Numpy provides fast and sophisticated handling of arrays in Python. NetCDF being an array-oriented data format, the Python/Numpy combination is a good candidate for NetCDF processing. In terms of performance, however, Python is not as fast as C. So if we are looping over the NetCDF data, as is the case when we merge two files, the computing time grows with the dimension size of the arrays. And if we have to merge thousands of such files, we have to look for opportunities to speed up the process. The processing time can be shortened by extracting the 'looping' code out into a custom C module which can then be called from the Python code. The C 'looping' code can be further parallelised using OpenMP. This gives us the best of both worlds: ease of development in Python and fast execution in C. Problem setup can also reduce processing time if the files are sorted so that adjoining files show increasing overlap in the dimensions. And if one has access to a cluster of machines, then exploiting parallelism at a coarser level, by running multiple merge processes simultaneously, will expedite the process more than just parallelising the loop, provided the number of machines available exceeds the number of cores in a single machine. There are also other use cases, such as providing aggregations and averages of NetCDF data. Existing third-party tools like NCO and CDO, which are themselves written in C, cannot handle all scenarios (such as leap years) or have other undesirable effects on the NetCDF files, such as renaming the dimensions. For these cases as well, custom code has to be written to provide a coherent, portable, extensible and fast solution. C/C++ is again a good choice for writing such code, keeping in mind the performance benefits it provides for processing large datasets. Modern C++, especially with the STL, lambda function support and smart pointers, makes a compelling case to be used over plain C, as it does away with issues like dangling pointers and manual memory allocation for dynamic arrays. Lambda functions combined with the STL result in very expressive and concise code, unlike procedural C code.
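A minimal sketch of the pure Python/Numpy side of such a merge is shown below, assuming the netCDF4 package and the simplest case of non-overlapping files that concatenate along a shared record dimension; it is not the authors' C/OpenMP implementation, and the file names and the dimension name "time" are hypothetical.

```python
# Sketch only: concatenate two NetCDF files along a shared record dimension with
# netCDF4 + numpy. Assumes both files have identical dimensions and variables apart
# from the concatenation dimension; fill-value and group handling are elided.
import numpy as np
from netCDF4 import Dataset

def concat_along(dim, in_a, in_b, out_path):
    with Dataset(in_a) as a, Dataset(in_b) as b, Dataset(out_path, "w") as out:
        # Copy dimensions; only the concatenation dimension grows.
        for name, d in a.dimensions.items():
            size = len(d) + len(b.dimensions[name]) if name == dim else len(d)
            out.createDimension(name, size)
        # Copy variables, stacking data along the concatenation dimension.
        for name, va in a.variables.items():
            vo = out.createVariable(name, va.datatype, va.dimensions)
            vo.setncatts({k: va.getncattr(k) for k in va.ncattrs()
                          if k != "_FillValue"})
            if dim in va.dimensions:
                axis = va.dimensions.index(dim)
                vo[:] = np.concatenate([va[:], b.variables[name][:]], axis=axis)
            else:
                vo[:] = va[:]

concat_along("time", "part1.nc", "part2.nc", "merged.nc")
```

The inner loop over variables is the part that the abstract proposes moving into a custom C module and parallelising with OpenMP when overlap handling and file counts make the pure Python version too slow.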
Materials Today: Proceedings, 2021
In this study the Taguchi method is applied to optimize the cutting parameters of CNC turning (spindle speed (rpm), feed rate (mm/min), depth of cut (mm)). The machining parameters for this investigation are applied to Brass 63/37, and the cutting parameters are optimized to obtain a better surface finish. A cemented carbide insert (CNMG120408-MS HTi10 Mitsubishi) is used for machining the Brass 63/37. The aim of this study is to optimize the turning parameters for a better surface finish. An L27 orthogonal array (Taguchi method) is used to conduct the experiments, and the results are analysed through main effect plots, ANOVA and a 3D scatter plot. The optimum conditions for minimum average roughness (Ra) are a spindle speed of 1400 rpm, a feed rate of 100 mm/min and a depth of cut (DOC) of 0.5 mm. The main effect plot provides significant evidence to support the main aim of the investigation. Furthermore, according to the ANOVA table, spindle speed contributes 65.81% and feed rate 31.38%, whereas DOC contributes only 0.95%. During this study it is observed that surface roughness can be manipulated by spindle speed and feed rate. Considering the results, spindle speed and feed rate are the main factors influencing surface roughness.
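The Taguchi analysis the abstract relies on reduces each experimental run to a signal-to-noise ratio; for a minimise-roughness objective this is the standard smaller-is-better form, S/N = -10 log10(mean(y^2)). The sketch below computes it for made-up Ra replicates, not the paper's measurements.

```python
# Smaller-is-better S/N ratio used in Taguchi analysis; the Ra values are placeholders.
import numpy as np

def sn_smaller_is_better(ra_values):
    ra = np.asarray(ra_values, dtype=float)
    return -10.0 * np.log10(np.mean(ra ** 2))

# Hypothetical Ra (micrometre) replicates for one L27 run.
print(round(sn_smaller_is_better([0.82, 0.79, 0.85]), 3))
```

The parameter level with the highest mean S/N across runs is the one the main effect plot identifies as optimal.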
Scandinavian Journal of Plastic and Reconstructive Surgery and Hand Surgery, 2004
Despite refinements in surgical technique, including bone grafting and sophisticated prosthetic reconstructions, there are limitations to what can be achieved with bone-anchored fixed prostheses in patients with advanced atrophy of the maxillae. A new approach was suggested by a long-term study on onlay bone grafting and simultaneous placement of a fixture based on a new design: the zygoma fixture, and the aim of this study was to assess its potential. Twenty-eight consecutive patients with severely resorbed edentulous maxillae were included, 13 of whom had previously had multiple fixture surgery in the jawbone that had failed. A total of 52 zygoma fixtures and 106 conventional fixtures were installed. Bone grafting was deemed necessary in 17 patients. All patients have been followed for at least five years, and nine for up to 10 years. All patients were followed up with clinical and radiographic examinations, and in some cases rhinoscopy and sinoscopy as well. Three zygoma fixtures failed; two at the time of connection of the abutment and the third after six years. Of the conventional fixtures placed at the time of the zygoma fixture, 29 (27%) were lost. The overall prosthetic rehabilitation rate was 96% after at least five years of function. There were no signs of inflammatory reaction in the surrounding antral mucosa. Four patients with recurrent sinusitis recovered after inferior meatal antrostomy. To conclude, the zygoma fixture seems to be a valuable addition to our repertoire in the management of the compromised maxilla.
Journal of Hydroinformatics, 2015
This paper outlines the application and usefulness of a software platform that enables hydrologists to develop custom functionality in a new hydrological modelling tool, eWater Source, designed for water resources planning and management. The flexible architecture of the software allows the incorporation of third-party components as plug-ins to add new capabilities that are not built in. Plug-ins can be developed to adapt the software to suit the needs of hydrologists with modest software development knowledge, which can result in improvements in workflow and efficiency. In addition, modellers can use plug-ins to integrate hydrological process and management models that may not be possible to build in the standard tool. The paper introduces the plug-in functionality of the modelling tool, its design and its applications, demonstrated with three example plug-ins. These are: (1) a data processing plug-in to upscale urban environment models; (2) a management rule plug-in to calculate loss al...
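As a generic illustration of the host-and-plug-in pattern described here (not the actual eWater Source plug-in interface), a hypothetical registry sketch is given below; all class and method names are invented for the example.

```python
# Generic plug-in pattern: the host defines a contract, third parties implement it,
# and a registry lets the host call extensions it knows nothing about at build time.
import abc

class ModelPlugin(abc.ABC):
    """Contract the host tool expects every plug-in to satisfy (hypothetical)."""
    @abc.abstractmethod
    def run(self, inputs: dict) -> dict: ...

class PluginRegistry:
    def __init__(self):
        self._plugins = {}
    def register(self, name, plugin: ModelPlugin):
        self._plugins[name] = plugin
    def run(self, name, inputs):
        return self._plugins[name].run(inputs)

# Example third-party extension: a trivial fixed-fraction loss rule (hypothetical).
class FixedLossRule(ModelPlugin):
    def __init__(self, fraction):
        self.fraction = fraction
    def run(self, inputs):
        return {"loss": inputs["inflow"] * self.fraction}

registry = PluginRegistry()
registry.register("fixed_loss", FixedLossRule(0.1))
print(registry.run("fixed_loss", {"inflow": 100.0}))  # {'loss': 10.0}
```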
International Journal of Computer Applications, 2011
The bit error rate (BER) performance of a Code Division Multiple Access (CDMA) cellular system based on the IS-95 standard, in the presence of additive white Gaussian noise (AWGN) and interference, is investigated in this paper. The performance is evaluated under two types of decision feedback receivers for the CDMA reverse link: (a) a hard-decision Viterbi decoder, in which the coded bit is estimated using the Hamming distance method, and (b) a soft-decision Viterbi decoder, in which the Euclidean distance method is used for coded bit estimation. The two decision feedback receiver techniques are compared over an AWGN channel. The performance of the CDMA system is shown in graphs of BER versus energy per bit to noise ratio (Eb/No).
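For context, BER-versus-Eb/No plots of this kind are usually read against the uncoded BPSK-over-AWGN curve; the sketch below computes that standard closed form, BER = 0.5 erfc(sqrt(Eb/N0)), and is a reference baseline rather than the paper's IS-95 simulation.

```python
# Reference curve: uncoded BPSK bit error rate over AWGN as a function of Eb/N0 in dB.
import numpy as np
from scipy.special import erfc

def bpsk_ber_awgn(ebno_db):
    ebno = 10.0 ** (np.asarray(ebno_db, dtype=float) / 10.0)  # dB -> linear
    return 0.5 * erfc(np.sqrt(ebno))

for db, ber in zip(range(0, 11, 2), bpsk_ber_awgn(range(0, 11, 2))):
    print(f"Eb/N0 = {db:2d} dB  BER = {ber:.2e}")
```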
Weber, T., McPhee, M.J. and Anderssen, R.S. (eds) MODSIM2015, 21st International Congress on Modelling and Simulation, Nov 29, 2015
Using optimisers to calibrate hydrological models is a computationally intensive process. Most optimisation algorithms run on desktop machines, with some running on Linux clusters and a couple running on cloud infrastructure (e.g. cloudPEST). Complex hydrological models require a relatively powerful machine, and calibration runtimes vary from an hour or less to days and sometimes weeks. Increasingly, organisations are looking to outsource the provision and management of computationally intensive infrastructure. While virtualisation technology can provide performance similar to high-end desktops, there are opportunities to harness parallelisation and reduce calibration times by hosting the modelling software on cloud infrastructure and exposing its functionality through web services. This paper investigates the practicality and performance of implementing a calibration wrapper for the eWater Source river modelling package. The Source calibration service allows users to calibrate models where the modelling software, eWater Source, is running on the cloud rather than on the end user's premises. The aim of this analysis was to compare the performance characteristics of a simple GR4J model for the Legerwood catchment using eWater Source running as desktop software versus running as a Source calibration service on the cloud. Shuffled Complex Evolution was used as the optimisation algorithm for the GR4J model parameters. The eWater Source product running as desktop software took around 4 minutes to calibrate the model, whereas the Source calibration service took around 73 minutes to do the same calibration with similar results. The difference in run times can be attributed to one or more of: 1) the chatty nature of communication between the machines running eWater Source and the optimisation algorithm; 2) a time-inefficient implementation of the SCEoptim routine from the hydromad package; and 3) performance bottlenecks in Source's external interface, which exposes eWater Source modelling capability through the command prompt. Given the long simulation runtimes, the current Source calibration service fails to meet the expectations of hydrological model builders for improved performance. For software implementers, we would recommend careful attention to the software architecture and performance characteristics of proposed cloud-based software implementations early in development. In this case, we anticipate that future improvements to the infrastructure, or renewed effort on the implementation, would lead to a faster implementation.
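The kind of calibration loop being benchmarked can be sketched with stand-ins: a toy two-parameter rainfall-runoff model and scipy's differential evolution in place of GR4J and Shuffled Complex Evolution. Everything below (parameter names, synthetic data, the model itself) is hypothetical and only illustrates the repeated model-run-plus-objective-evaluation structure whose per-call overhead dominated the cloud service's runtime.

```python
# Toy calibration loop: an optimiser repeatedly runs a model and scores it against
# observations. Stand-ins for GR4J and SCE; not the eWater Source workflow.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.5, scale=4.0, size=365)           # synthetic daily rainfall

def toy_model(params, rain):
    runoff_coef, recession = params
    store, flow = 0.0, np.empty_like(rain)
    for t, r in enumerate(rain):                            # simple linear store
        store = store * recession + r * runoff_coef
        flow[t] = store * (1.0 - recession)
    return flow

obs = toy_model([0.3, 0.7], rain) + rng.normal(0.0, 0.05, rain.size)

def objective(params):                                      # sum of squared errors
    return np.sum((toy_model(params, rain) - obs) ** 2)

result = differential_evolution(objective, bounds=[(0.0, 1.0), (0.0, 0.99)], seed=1)
print(result.x, result.fun)
```

When each objective evaluation goes over a web service instead of calling the model in-process, the same loop incurs a network round trip per model run, which is the "chatty communication" explanation offered above.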