Shireesh Verma - Academia.edu
Papers by Shireesh Verma
Design, Automation, and Test in Europe, Apr 16, 2007
A preview of the coverage-model example from the paper (OpenVera-style functional coverage; the preview is truncated):

    coverage_group example {
        sample_event = @(posedge CLOCK);
        sample a, wr_en, qb_p1, qb_p2;
        cross func_cov (a, wr_en, qb_p1, qb_p2) {
            state cvg_1 ((a == 1) && (wr_en == 1) && (qb_p1 == 1))
            state cvg_2 ((a == 1) && (wr_en == 0) && (qb_p2 == ...
With the widespread use of PDAs for communication, it has become essential to support efficient group communication. One means of doing so is a multicasting framework for sending messages to a group of people. In this report we describe the approach we took to implement a multicasting framework for the real-time operating system eCos, along with some simulation results. We also present our analysis of improving multicasting in wireless networks through the use of mobility prediction.
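To ground the mechanism such a framework builds on, the sketch below (illustrative, not code from the report) sends a datagram to an IPv4 multicast group through the BSD socket API, which eCos exposes; the group address 239.0.0.1, port 5000, and message are placeholder values:

    // Minimal IPv4 multicast sender using the BSD socket API.
    // Group address, port, and message are illustrative placeholders.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        if (sock < 0) { perror("socket"); return 1; }

        sockaddr_in group{};
        group.sin_family = AF_INET;
        group.sin_port = htons(5000);                      // placeholder port
        inet_pton(AF_INET, "239.0.0.1", &group.sin_addr);  // placeholder group

        // TTL of 1 keeps datagrams on the local segment; crossing
        // routers requires a larger TTL.
        unsigned char ttl = 1;
        setsockopt(sock, IPPROTO_IP, IP_MULTICAST_TTL, &ttl, sizeof(ttl));

        const char msg[] = "hello, group";
        sendto(sock, msg, sizeof(msg), 0,
               reinterpret_cast<sockaddr*>(&group), sizeof(group));
        close(sock);
        return 0;
    }

Receivers would join the same group with an IP_ADD_MEMBERSHIP socket option and read from the bound port.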
2009 10th International Symposium on Quality of Electronic Design, 2009
Abstract: We are at the crossroads of some fundamental changes taking place in the semiconductor industry. Power is now a primary design criterion for the bulk of semiconductor designs and a key reason behind the shift towards multi-core designs, as the increase in power ...
Design simulation and model checking are two alternative and complementary techniques for verifying hardware designs. This paper presents a comparison between the two techniques based on their detection of design errors, performance, and memory use. While memory and CPU performance gains in simulation-based validation over model checking verification methods are to be expected, the ability of simulation-based validation in detecting errors is comparable to that of model checking. We perform error detection experiments using model checking and simulation to detect errors injected into a verification benchmark suite. The results allow a quantitative comparison of simulation and model checking which can be used to understand weaknesses of both approaches. We see that simulation-based validation is effective, and where it is not, we define test generation goals which make it effective.
This report presents a methodology towards a complete software implementation of G.729 Annex E on an ARM926EJ-S processor running at a 200 MHz clock speed. The methodology spans both target-independent and target-dependent optimizations. The main focus is on code speedup, reducing the per-frame execution time as far below the 10 ms constraint as possible. Although no constraint is given for code size, the outcome of the optimization process has so far resulted in an overall reduction of the code size. The code was also run on an Aptix FPGA board to calibrate the ARMulator (the ARM instruction set simulator). Although the optimizations presented here were performed and analyzed on the G.729E ...
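As a flavor of what target-independent optimization means in practice, here is an illustrative example (not taken from the report) of loop unrolling on a fixed-point inner product, the kind of hot loop that dominates speech codecs such as G.729:

    // Illustrative target-independent optimization of a fixed-point
    // inner product. Unrolling reduces branch overhead and exposes
    // instruction-level parallelism to the compiler and pipeline.
    #include <cstdint>
    #include <cstdio>

    // Straightforward reference version.
    int32_t dot_ref(const int16_t* a, const int16_t* b, int n) {
        int32_t acc = 0;
        for (int i = 0; i < n; ++i)
            acc += static_cast<int32_t>(a[i]) * b[i];
        return acc;
    }

    // Unrolled by four with two accumulators (n must be a multiple
    // of 4 in this sketch).
    int32_t dot_unrolled(const int16_t* a, const int16_t* b, int n) {
        int32_t acc0 = 0, acc1 = 0;
        for (int i = 0; i < n; i += 4) {
            acc0 += static_cast<int32_t>(a[i])     * b[i]
                  + static_cast<int32_t>(a[i + 1]) * b[i + 1];
            acc1 += static_cast<int32_t>(a[i + 2]) * b[i + 2]
                  + static_cast<int32_t>(a[i + 3]) * b[i + 3];
        }
        return acc0 + acc1;
    }

    int main() {
        int16_t a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        int16_t b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        std::printf("%d %d\n", dot_ref(a, b, 8), dot_unrolled(a, b, 8));
        return 0;
    }

Target-dependent work would then map such loops onto processor-specific features, for example the ARM9E multiply-accumulate instructions.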
In the course of system-level design, designers need an efficient system-level design language (SLDL) to serve as the design vehicle. The complexity of the design process at this level is determined, to an extent, by the semantics and syntax of the SLDL being used. This report first analyzes the system-level design flow in order to establish the requirements on an SLDL. It then compares SpecC and SystemC, the two popular SLDLs, in terms of the extent to which they meet these requirements. Finally, it provides essential modeling guidelines for both SpecC and SystemC users for cases where the constructs or features of the respective languages give rise to an ambiguous design.
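One classic ambiguity such guidelines address is shown below as a SystemC sketch (assuming a standard SystemC installation; the example is illustrative, not from the report): two processes write a shared plain variable in the same delta cycle, and since the language standard leaves the order of runnable processes unspecified, the final value is simulator-dependent.

    // Two SC_THREAD processes write the same plain variable in the
    // same delta cycle; the SystemC standard leaves the execution
    // order of runnable processes unspecified, so 'x' may end up
    // 1 or 2 depending on the simulator.
    #include <systemc.h>
    #include <iostream>

    SC_MODULE(Race) {
        int x;
        SC_CTOR(Race) : x(0) {
            SC_THREAD(writer_a);
            SC_THREAD(writer_b);
        }
        void writer_a() { x = 1; }
        void writer_b() { x = 2; }
    };

    int sc_main(int, char*[]) {
        Race r("race");
        sc_start();
        std::cout << "x = " << r.x << std::endl;  // simulator-dependent
        return 0;
    }

Guidelines typically resolve this by routing inter-process communication through channels or signals with defined update semantics instead of plain variables.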
Journal of Low Power Electronics, 2011
Journal of Low Power Electronics, 2011
Practical Design Verification, 2009
2008 Ninth International Workshop on Microprocessor Test and Verification, 2008
Abstract: Power consumption has become one of the most important differentiating factors for semiconductor products. Voltage is the strongest handle for managing chip power consumption. We look in detail at some of the key power management techniques, such as ...
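The reason voltage is the strongest handle follows from the textbook dynamic power relation (a standard identity, not a formula quoted from this paper):

    P_dyn = alpha * C * V^2 * f

where alpha is the switching activity, C the switched capacitance, V the supply voltage, and f the clock frequency. Voltage enters quadratically, so scaling the supply from 1.0 V to 0.8 V cuts dynamic power by 36% even at constant frequency; and since lowering V usually permits lowering f as well, the combined saving is closer to cubic in V.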
Journal of Computer Science, 2009
Problem statement: Dynamic verification, the use of simulation to determine design correctness, is widely used due to its tractability for large hardware designs. A serious limitation of dynamic techniques is the difficulty in determining whether or not a test sequence is sufficient to detect all likely design errors. Coverage metrics are used to address this problem by providing a set of goals to be achieved during the simulation process; if all coverage goals are satisfied, the test sequence is assumed to be complete. Coverage metrics hence evaluate the ability of a test sequence to detect design errors and are essential to the verification process. A key source of difficulty in determining error detection is that the control-flow path traversed in the presence of an error cannot be determined. This problem becomes particularly difficult in the case of typical industrial designs involving interaction of control-flow paths between concurrent processes. Error detection can only be accurately determined by exploring the set of all control-flow paths that may be traversed as a result of an error. Also, there is no technique to identify a correlation between coverage metrics and hardware design quality. Approach: We present a coverage metric that determines the propagation of error effects along all possible erroneous control-flow paths across processes. The complexity of exploring multiple control-flow paths is greatly alleviated by heuristically pruning infeasible control-flow paths using the algorithm that we present. We also present a technique to evaluate a coverage metric by examining its ability to ensure the detection of real design errors: we inject errors into the design and correlate their detection with the coverage computed by our metric. Results: Although our coverage metric analyzes all control-flow paths, it prunes the infeasible ones, eliminating them from coverage consideration and thereby avoiding the cost of generating tests meant to execute them. The metric also correlates better with the detection of design errors than some well-studied metrics do. Conclusion: The proposed coverage metric provides high accuracy in the measurement of coverage in designs that contain complex control flow with concurrent processes, and it is superior at detecting design errors when compared with the other metrics evaluated.
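A minimal sketch of this kind of infeasibility pruning (illustrative only; the paper's algorithm is more general): model each control-flow path's condition as interval constraints on shared variables, and discard a cross-process pair whenever the conjoined constraints on some shared variable cannot intersect.

    // Illustrative pruning of infeasible cross-process path pairs.
    // A path condition is modeled as interval constraints on shared
    // variables; a pair is infeasible if any shared variable's
    // intervals fail to intersect. (Real path conditions are richer
    // than intervals; this is a sketch of the idea.)
    #include <cstdio>
    #include <map>
    #include <string>
    #include <utility>

    using PathCond = std::map<std::string, std::pair<int, int>>;  // var -> [lo, hi]

    bool feasible(const PathCond& p, const PathCond& q) {
        for (const auto& [var, r1] : p) {
            auto it = q.find(var);
            if (it == q.end()) continue;        // variable not shared
            const auto& r2 = it->second;
            if (r1.second < r2.first || r2.second < r1.first)
                return false;                   // empty intersection
        }
        return true;
    }

    int main() {
        // Process A's path needs mode in [1,3]; process B's paths need
        // mode in [5,7] and [2,6] respectively.
        PathCond a = {{"mode", {1, 3}}};
        PathCond b = {{"mode", {5, 7}}};
        PathCond c = {{"mode", {2, 6}}};
        std::printf("a&b: %s\n", feasible(a, b) ? "feasible" : "infeasible");
        std::printf("a&c: %s\n", feasible(a, c) ? "feasible" : "infeasible");
        return 0;
    }

Every pair rejected this way is removed from the coverage denominator, so no test-generation effort is spent chasing it.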
Proceedings of the ASP-DAC 2005. Asia and South Pacific Design Automation Conference, 2005., 2005
Coverage metrics, which evaluate the ability of a test sequence to detect design faults, are essential to the validation process. A key source of difficulty in determining fault detection is that the control flow path traversed in the presence of a fault cannot be determined. Fault detection can only be accurately determined by exploring the set of all control flow paths that may be traversed as a result of a fault. We present a coverage metric that determines the propagation of fault effects along all possible faulty control flow paths. The complexity of exploring multiple control flow paths is greatly alleviated by heuristically pruning infeasible control flow paths using the algorithm that we present. The proposed coverage metric provides high accuracy in designs that contain complex control flow. The results obtained are promising.
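The set of possible faulty control flow paths can be pictured as single-branch deviations from the fault-free path; a toy enumeration of that idea (illustrative, not the paper's algorithm):

    // Toy enumeration of faulty control-flow paths as single-branch
    // deviations from the fault-free path (illustrative only).
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<bool> good = {true, false, true};  // fault-free branch outcomes
        std::vector<std::vector<bool>> faulty;
        for (size_t i = 0; i < good.size(); ++i) {
            std::vector<bool> p = good;
            p[i] = !p[i];                              // a fault flips one decision
            faulty.push_back(p);
        }
        for (const auto& p : faulty) {
            for (bool b : p) std::printf("%c", b ? 'T' : 'F');
            std::printf("\n");
        }
        return 0;
    }

A metric over this space must then check, for each deviation, whether the fault effect propagates to an observable output.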
2007 IEEE International High Level Design Validation and Test Workshop, 2007
Functional coverage models, which measure the sufficiency of test stimuli, are essential to the verification process. A key source of difficulty in their deployment emanates from the manual and imprecise nature of their development process and the lack of a sound measure of ...
2007 IEEE International High Level Design Validation and Test Workshop, 2007
... Ian G. Harris and Shireesh Verma, Department of Computer Science, University of California Irvine, Irvine, CA 92697 ... The earliest CFG coverage metrics include statement coverage, branch coverage, and path coverage [2], models used in software testing. ...
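As a reminder of what these metrics count, here is a minimal branch-coverage harness (illustrative; not code from the paper) that records which outcome of each branch a test set exercises:

    // Minimal branch-coverage illustration: record which outcome of
    // each branch the tests exercise (illustrative only).
    #include <cstdio>
    #include <set>
    #include <utility>

    std::set<std::pair<int, bool>> hits;  // (branch id, outcome taken)

    int classify(int x) {
        bool c0 = (x > 0);
        hits.insert({0, c0});             // branch 0
        if (!c0) return -1;
        bool c1 = (x % 2 == 0);
        hits.insert({1, c1});             // branch 1
        return c1 ? 0 : 1;
    }

    int main() {
        int tests[] = {4, 7};             // never exercises x <= 0
        for (int t : tests) classify(t);
        // 2 branches * 2 outcomes = 4 targets; these tests cover 3.
        std::printf("branch coverage: %zu/4\n", hits.size());
        return 0;
    }

Statement coverage would already be satisfied here; path coverage would additionally demand every feasible combination of branch outcomes.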
2008 IEEE International High Level Design Validation and Test Workshop, 2008
... Metric. Kiran Ramineni (kiran@alumni.uci.edu), Shireesh Verma (shireesh@ics.uci.edu), Ian G. Harris (harris@ics.uci.edu). ... In dataflow testing, each variable occurrence is classified as either a definition occurrence or a use occurrence. ...
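The def-use classification is easiest to see on an annotated fragment (illustrative, not from the paper):

    // Def-use classification in dataflow testing (illustrative).
    #include <cstdio>

    int main() {
        int x = 5;          // definition occurrence of x
        int y = x + 1;      // use occurrence of x; definition of y
        if (y > 3)          // use of y in a predicate (a p-use)
            x = y * 2;      // use of y in a computation (a c-use); redefinition of x
        std::printf("%d\n", x);  // use of x; the pair (def of x in the
                                 // branch, this use) is one def-use
                                 // target a dataflow-adequate test set
                                 // must cover.
        return 0;
    }

A dataflow coverage metric counts such def-use pairs and asks the test set to execute a path from each definition to each of its reachable uses.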
Journal of Computer Science, 2009
Problem statement: Verification of the correct functionality of semiconductor devices has been a challenging problem. Given the cost of device fabrication, it is critical to verify the expected functionality using simulations of executable device models before a device is manufactured. However, typical industrial-scale devices today involve a large number of interactions between their components, and the complexity of verifying all interactions becomes almost intractable even in simulation. The infeasible interactions need to be eliminated from verification consideration in order to reduce the complexity of the problem. Also needed is an empirical metric of the completeness of the verification of such interactions. This metric should provide a measure of the quality of verification as well as of the degree of confidence in the future correct behavior of the device, and it should guide stimulus generation for simulation so that all aspects of the device functionality can be covered in verification. Existing coverage metrics focus almost exclusively on verification of individual components. Approach: In this study, we consider interactions between device components modeled as independent processes; the interactions considered are between control-flow paths in different processes. The presented algorithm analyzes the dependency between the control-flow paths, determines the set of feasible interactions between them, and prunes out the infeasible ones. The remaining set of feasible interactions constitutes our interaction coverage metric. The metric handles device designs with an arbitrary number of processes. Results: The number of interactions to be considered in simulation-based verification is significantly reduced by our coverage metric using the proposed algorithms. This limits the complexity and scope of stimulus generation to coverage of only the set of feasible interactions. Conclusion: The proposed coverage metric provides a realistic measure of the degree of verification of component interactions and effectively guides the test generation process for device designs consisting of an arbitrary number of components.
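A toy sketch of the resulting metric (illustrative; the feasibility check here is deliberately simplistic): enumerate the cross product of per-process paths, keep the feasible pairs as the coverage denominator, and count exercised pairs against it.

    // Interaction coverage over feasible cross-process path pairs
    // (illustrative sketch). Each path carries a toy condition, an
    // exact required value of a shared "mode" variable; a pair is
    // feasible only when the conditions agree.
    #include <cstdio>
    #include <set>
    #include <utility>
    #include <vector>

    struct Path { int id; int mode; };

    int main() {
        std::vector<Path> procA = {{0, 1}, {1, 2}};
        std::vector<Path> procB = {{0, 1}, {1, 2}};

        // Enumerate the cross product, keeping only feasible pairs.
        std::set<std::pair<int, int>> feasiblePairs;
        for (const auto& a : procA)
            for (const auto& b : procB)
                if (a.mode == b.mode)            // toy feasibility check
                    feasiblePairs.insert({a.id, b.id});

        // Suppose simulation exercised only the pair (0, 0).
        std::set<std::pair<int, int>> covered = {{0, 0}};

        std::printf("interaction coverage: %zu/%zu\n",
                    covered.size(), feasiblePairs.size());
        return 0;
    }

Of the four enumerated pairs, only two are feasible, so the denominator shrinks from 4 to 2 and stimulus generation targets only those two.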
2008 IEEE International High Level Design Validation and Test Workshop, 2008
The ever-rising complexity of current hardware designs and the huge cost penalty of delivering a faulty product have led to growing investment in functional verification and in the development of new technologies and methodologies in this area. Traditional HDL-based testbenches are not proving sufficient for verification. Verification teams are switching to languages such as C++ and SystemC ...
Proceedings -Design, Automation and Test in Europe, DATE, 2007
Transaction-level modeling (TLM) has been proposed as the leading strategy to address the ever-increasing complexity of digital systems. However, its introduction raises a new challenge for designers and verification engineers, since there are no mature tools to automatically synthesize an RTL implementation from a transaction-level (TL) design; thus manual refinements are mandatory. In this context, the paper presents an incremental assertion-based verification (ABV) methodology to check the correctness of the TL-to-RTL refinement. The methodology relies on reusing assertions and already-checked code, and it is guided by an assertion coverage metric.
2006 IEEE International High Level Design Validation and Test Workshop, 2006
Design simulation and model checking are two alternative and complementary techniques for verifying hardware designs. This paper presents a comparison between the two techniques based on detection of design errors, performance, and memory use. We perform error detection experiments using model checking and simulation to detect errors injected into a verification benchmark suite. The results allow a quantitative comparison of ...