Neeraj Kumar Singh - Academia.edu
Papers by Neeraj Kumar Singh
Information Studies, 2015
The paper examines 21,397 global publications in cloud computing research, as covered in the Scopus database during 2004–13. The annual average growth rate is 96.77%. The global cloud computing research output comes from several countries, of which the top 15 most productive account for a 90.07% share of the global output. The largest shares of output (24.12% and 21.96%) came from China and the USA, followed by Germany, India, the UK, Taiwan, Australia, Italy, Japan, etc. Austria registered the highest share (49.83%) of international collaborative papers among the top 15 countries in cloud computing during 2004–13. Computer science contributed the largest share (79.06%) during 2004–13, followed by engineering (21.20%), mathematics (15.47%), social sciences (4.79%), etc. The top 25 organizations and top 20 authors contributed 16.16% and 3.87% shares of the global output during 2004–13. Conference proceedings (62.92%) and journals (22.05%) contributed the largest shares of the global output during 2004–13. The top 20 journals contributed a 21.81% share of the total journal output during 2004–13. Only 71 publications (out of the total global publications) registered 100 or more citations from their publication until June 2014, with a citation impact per paper of 216.68 during 2004–13.
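The bibliometric indicators quoted above (annual average growth rate, percentage shares) can be sketched as simple computations over yearly publication counts. The counts below are hypothetical illustration values, not the study's Scopus data.

```python
# Sketch of the two indicators used in the abstract, over made-up yearly counts.

def annual_average_growth_rate(counts):
    """Mean of the year-over-year percentage growth rates."""
    rates = [(b - a) / a * 100 for a, b in zip(counts, counts[1:])]
    return sum(rates) / len(rates)

def share(part, total):
    """Percentage share of a subset (e.g. one country's papers) in the total output."""
    return part / total * 100

yearly = [10, 25, 60, 150, 400]   # hypothetical publication counts over five years
total = sum(yearly)               # 645

print(round(annual_average_growth_rate(yearly), 2))  # → 151.67
print(round(share(155, total), 2))                   # → 24.03
```

Note that country shares computed this way can sum to more than 100% because internationally co-authored papers are counted once per contributing country, which is why the per-country figures in the abstract overlap.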
2022 2nd International Conference on Innovative Sustainable Computational Technologies (CISCT), Dec 23, 2022
2022 2nd International Conference on Innovative Sustainable Computational Technologies (CISCT)
Industrial System Engineering for Drones, 2019
The second half of the product development cycle is the board bring-up, which aids the final system bring-up. No physical hardware or system parts are seen until the Gerber release; that is when the bring-up hardware and mechanical parts become available for hands-on work. The upcoming chapters discuss the manufacturing, power-on, testing, and validation of the actual hardware and the system. The pilot system build starts once the system is tested and validated against all the design and certification requirements in the PRD, and the system is deployed with its feature list. Pilot systems are usually distributed to internal customers and are restricted to a lab environment until the system passes regulatory and precompliance testing. Feedback and input from customers are consolidated to fine-tune the system further. Often, there is a separate tool for bug assignment, tracking, and issue resolution.
Communications in Computer and Information Science, 2020
This paper reports our experience of developing a Human-Machine Interface (HMI) complying with the ARINC 661 specification standard for interactive cockpit applications using formal methods. The development relies on the FLUID modelling language, which we proposed and formally defined in the FORMEDICIS project. FLUID contains the essential features required for specifying HMIs. To develop the Multi-Purpose Interactive Applications (MPIA) use case, we follow these steps: an abstract model of MPIA is written in the FLUID language; this MPIA FLUID model is used to produce an Event-B model for checking the functional behaviour, user interactions, safety properties, and domain-related interaction properties; the Event-B model is also used to check temporal properties and possible scenarios using the ProB model checker; and finally, the MPIA FLUID model is translated to Interactive Cooperative Objects (ICO) using the PetShop CASE tool to validate the dynamic behaviour, visual properties, and task analysis. These steps rely on different tools to check internal consistency along with possible HMI properties. Finally, the formal development of the MPIA case study using FLUID, and its embedding into other formal techniques, demonstrates the reliability, scalability, and feasibility of the approach defined in the FORMEDICIS project.
Industrial System Engineering for Drones, 2019
JOURNAL OF CLINICAL AND DIAGNOSTIC RESEARCH, 2019
A 20-year-old female presented to the emergency department with chief complaints of fever for two days and altered behaviour for one day, without any episode of seizure. No similar episodes had occurred in the past, nor was there a family history. On physical examination, her Glasgow Coma Scale (GCS) score was E2V2M5; she was febrile (102°F), her pulse rate was 90/minute and regular, and her blood pressure (BP) was 100/70 mmHg. Neck rigidity and Kernig's sign were negative. On palpation of the abdomen, there was splenomegaly 5 cm below the left costal margin. Fundus examination was normal. All other systemic examinations were within normal limits. The patient was suspected to have an inflammatory brain disease such as acute viral encephalitis, complicated malaria, pyogenic meningitis, septic encephalopathy, or an aseptic meningitis such as leptospirosis, scrub typhus, or dengue fever.
System on Chip Interfaces for Low Power Design, 2016
The Memory Interfaces chapter discusses, in detail, the various interfaces for volatile and nonvolatile storage, their applicability to various scenarios, and their capabilities. After reading this chapter, the reader should understand when to use a particular interface and what advantages it has over other interfaces for a particular design.
System on Chip Interfaces for Low Power Design, 2016
Following up on the general discussion of various interfaces in an earlier chapter, this chapter discusses, in detail, the various interfaces used for power delivery, charging, and so on, their applicability to various scenarios, and their capabilities. After this chapter, the reader should understand when to use a particular interface and what its advantages are.
Lecture Notes in Computer Science, 2013
The verification of distributed algorithms is a challenge for formal techniques supported by tools, such as model checkers and proof assistants. The difficulties lie in deriving proofs of required properties, such as safety and eventuality, for distributed algorithms. In this paper, we present a methodology based on the general concept of refinement that is used for developing distributed algorithms satisfying a given list of safety and liveness properties. The methodology is a recipe for reusing the old ingredients of the classical temporal approaches, illustrated through a standard example of routing protocols. More precisely, we show how state-based models can be developed for specific problems and how they can be simply reused by controlling the composition of state-based models through the refinement relationship. The service-as-event paradigm is introduced to help users describe algorithms as a composition of simple services and/or decompose them into simple steps. Consequently, we obtain a framework for deriving new distributed algorithms by developing existing distributed algorithms with a correct-by-construction approach, which ensures the correctness of the developed algorithms.
Lecture Notes in Computer Science, 2011
This paper presents an incremental formal development of the Dynamic Source Routing (DSR) protocol in Event-B. DSR is a reactive routing protocol, which finds a route to a destination on demand, whenever communication is needed. Route discovery is an ...
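The on-demand route discovery the abstract mentions can be sketched as a flooding search in which each forwarded route request accumulates the path it has travelled, so the copy that reaches the destination carries a complete source route. This is an illustrative sketch of that idea only, not the paper's Event-B development; the topology and node names are hypothetical.

```python
# Minimal sketch of DSR-style route discovery: breadth-first flooding of a
# route request, with each queued entry holding the accumulated source route.

from collections import deque

def discover_route(topology, source, destination):
    """Flood a route request; return the first complete source route found."""
    queue = deque([[source]])
    visited = {source}                       # nodes drop duplicate requests
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == destination:
            return path                      # a route reply would carry this path back
        for neighbour in topology.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None                              # discovery fails: no route exists

links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(discover_route(links, "A", "E"))       # → ['A', 'B', 'D', 'E']
```

A real DSR node would also cache discovered routes and run route maintenance when links break; the sketch covers only the discovery step.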
Communications in Computer and Information Science, 2014
The failure of hardware or software in a critical system can lead to loss of lives. Design errors, which can be introduced during the system development process, can be a main source of such failures. Formal techniques are an alternative approach to verifying the correctness of critical systems, overcoming limitations of traditional validation techniques such as simulation and testing. The increasing complexity and failure rate bring new challenges to the verification and validation of avionic systems. Since the reliability of software cannot be quantified, a correct-by-construction approach can help implement a reliable system. Refinement plays a major role in building a large system incrementally from an abstract specification to a concrete system. This paper contributes a stepwise formal development of the landing gear system of an aircraft. The formal models include the complex behaviour, temporal behaviour, and sequence of operations of the landing gear system. The models are formalized in the Event-B modelling language, which supports stepwise refinement. This case study is considered a benchmark for techniques and tools dedicated to the verification of behavioural properties of systems.
Innovations in Systems and Software Engineering, 2011
Model-driven development (MDD) is a very popular technique in the area of software development, but it is criticized for lacking a formal semantics. MDD is used for large-scale system development with semi-formal techniques such as UML (Unified Modeling Language), which are not amenable to formal analysis and consistency checking. Combining formal methods with MDD may provide an assurance of the correctness of the system. This paper advocates an approach to building a generic framework for rigorous MDD that combines semi-formal notations with formal modeling languages, establishes the correctness of the system using a model checker, and generates code automatically from the verified formal specification. The main objective of this work is to apply model-driven techniques and tools, together with formal verification and code generation, to the design of critical systems. An assessment of the proposed framework is given through a case study on the development of a cardiac pacemaker system.
Neurological Sciences, 2014
Epidemiologic findings suggest that lipids and alterations in lipid-metabolizing proteins/genes may contribute to the development of neurodegenerative disorders. The aim of the current study was to determine serum lipid levels and genetic variation in two lipid-metabolizing genes, low-density lipoprotein receptor-related protein-associated protein (LRPAP1) and apolipoprotein E (APOE), in Parkinson's disease (PD). Based on well-defined inclusion and exclusion criteria, this study included 70 patients with PD and 100 age-matched controls. LRPAP1 and APOE gene polymorphisms were analyzed by polymerase chain reaction and restriction fragment length polymorphism, respectively. Fasting serum lipid levels were determined using an autoanalyser. Logistic regression analysis showed that high levels of serum cholesterol [odds ratio (OR) = 1.101, 95% confidence interval (CI) = 1.067–1.135], the LRPAP1 I allelic variant alone (OR = 2.766, CI = 1.137–6.752), and the variant in combination with the APOE e4 allelic variant (OR = 4.187, CI = 1.621–10.82) were significantly associated with an increase in PD risk. Apart from that, high levels of LDL cholesterol appear to have a protective role (OR = 0.931, CI = 0.897–0.966) against PD. The LRPAP1 I allelic variant may be considered a candidate gene for PD, predominantly in patients carrying the APOE e4 allelic variant. Keywords: Neurodegenerative disease · Parkinson's disease · APOE e4 allele · LRPAP1 I allele · Cholesterol · LDL cholesterol
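The odds ratios and confidence intervals reported above come from logistic regression; an OR and its 95% CI are recovered from a fitted coefficient by exponentiation. The sketch below shows only that standard transformation; the coefficient and standard error are made-up illustration values, not the study's data.

```python
# How an odds ratio and its 95% CI follow from a logistic-regression coefficient.

import math

def odds_ratio_ci(beta, se, z=1.96):
    """OR = exp(beta); 95% CI = exp(beta - z*se), exp(beta + z*se)."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

beta, se = 1.018, 0.455                  # hypothetical coefficient and standard error
or_, (lo, hi) = odds_ratio_ci(beta, se)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```

Because the interval is computed on the log-odds scale and then exponentiated, it is asymmetric around the OR, which is why the reported intervals (e.g. 1.137–6.752 around 2.766) are wider on the upper side.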
Today, evidence-based medicine has produced a number of clinical practice guidelines and protocols. Clinical guidelines systematically assist practitioners in providing appropriate health care for specific clinical circumstances. However, a significant number of ...
American Journal of Alzheimer's Disease & Other Dementias, 2012
Objectives: The aim was to examine the gene-environment (GxE) interaction with reference to APOE genotypes, serum lipids, and organochlorine pesticides (OCPs) as one of the factors in the etiology of Alzheimer's disease (AD). Methods: A case-control study was used to examine the APOE HhaI polymorphism by the polymerase chain reaction (PCR)/PCR-restriction fragment length polymorphism method, serum lipids by autoanalyser, and OCPs by gas chromatography (GC). Results: The APOE ∊4 allele frequency was significantly higher (p = 0.000, OR = 5.73, CI = 2.68–12.50) in AD than in controls. Serum cholesterol, β-hexachlorocyclohexane, and dieldrin are risk factors for AD independent of the APOE ∊4 risk allele, recording odds ratios of 1.16, 11.38, and 10.45, respectively. Conclusion: GxE interactions exist with APOE ∊4 allele status that need to be considered in the study design and analysis of such data in future studies of AD.