Piotr Zając - Academia.edu
Papers by Piotr Zając
Materials Science Forum, 2005
Magnetorheological (MR) fluid dampers are semi-active control devices that have been applied to a wide range of practical vibration control applications. This paper concerns the experimental identification of the dynamic behaviour of an MR damper and the use of the identified parameters in the control of such a damper. Feed-forward and recurrent neural networks are used to model both the…
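The abstract is truncated here, but the modelling idea it mentions (feed-forward networks identifying damper dynamics) can be illustrated with a minimal sketch. This is not the authors' identification procedure: a tiny NumPy feed-forward network is trained by gradient descent to map displacement, velocity and coil current to damper force, on synthetic data generated from an assumed Bingham-style force expression used purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (illustrative only): displacement x, velocity v, coil current i.
N = 2000
x = rng.uniform(-0.02, 0.02, N)   # m
v = rng.uniform(-0.5, 0.5, N)     # m/s
i = rng.uniform(0.0, 2.0, N)      # A
# Assumed Bingham-style damper force used only to generate targets: viscous + current-dependent friction.
f = 800.0 * v + 300.0 * i * np.tanh(50.0 * v) + rng.normal(0.0, 5.0, N)

X = np.column_stack([x, v, i])
y = f.reshape(-1, 1)

# Normalise inputs and outputs so plain gradient descent is well conditioned.
Xm, Xs = X.mean(0), X.std(0)
ym, ys = y.mean(), y.std()
Xn, yn = (X - Xm) / Xs, (y - ym) / ys

# One-hidden-layer feed-forward network: 3 -> 16 -> 1.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(3000):
    h = np.tanh(Xn @ W1 + b1)      # hidden activations
    pred = h @ W2 + b2             # predicted (normalised) force
    err = pred - yn
    # Backpropagation for the mean-squared-error loss.
    gW2 = h.T @ err / N; gb2 = err.mean(0)
    gh = err @ W2.T * (1 - h**2)
    gW1 = Xn.T @ gh / N; gb1 = gh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Predict damper force for a new operating point (values are illustrative).
xq = (np.array([[0.0, 0.2, 1.0]]) - Xm) / Xs
force = (np.tanh(xq @ W1 + b1) @ W2 + b2) * ys + ym
print("predicted force [N]:", float(force[0, 0]))
```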
Smart Materials and Structures, 2010
This paper presents a procedure for manufacturing composites, a methodology for testing them and the cyclic properties of isotropic magnetorheological elastomers. The choice of a thermoplastic matrix and magnetically active iron powder as the filling (much larger than the carbonyl iron powder filling used so far) is expounded. A manufacturing technology has been developed. Possibilities for the experimental investigation of…
Journal of Physics: Conference Series, 2009
An original magnetorheological composite (MRC) with a porous elastomeric matrix filled with magnetorheological fluid was developed in this work. It was later used to build a damper working in the shearing mode, without friction against the external surfaces of the so-called skid. This prototype construction was used for damping free vibrations of a beam. An analysis of the effectiveness in…
2014 Proceedings of the 21st International Conference Mixed Design of Integrated Circuits and Systems (MIXDES), 2014
Simulation is nowadays one of the crucial steps in the design process of a device. It allows estimating the device performance and reducing the cost of the fabrication process. It is also very useful in a statistical approach: as the technological process is not ideal and always produces some deviations from the established parameters, the final device has variable performance. In this paper, a statistical approach using Monte Carlo modelling is presented in order to estimate the stiffness of a rectangular membrane with various length-to-width ratios.
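A minimal sketch of how such a Monte Carlo estimate might look (not the model used in the paper): the membrane thickness and Young's modulus are perturbed around nominal values and a textbook clamped-plate stiffness is evaluated for each sample. The deflection coefficients for the different length-to-width ratios are standard plate-theory table values and should be treated here as assumptions, as should the nominal dimensions and variation magnitudes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Deflection coefficient alpha(a/b) for a clamped rectangular plate under uniform
# pressure, w_max = alpha * q * b**4 / D (plate-theory table values; treat as assumptions).
ASPECT = {1.0: 0.00126, 1.5: 0.00220, 2.0: 0.00254}

def stiffness(E, t, b, ratio, nu=0.22):
    """Effective stiffness q / w_max of a clamped plate, in Pa per metre of deflection."""
    D = E * t**3 / (12.0 * (1.0 - nu**2))   # flexural rigidity
    return D / (ASPECT[ratio] * b**4)

# Nominal silicon membrane: shorter side 1 mm, 20 um thick (illustrative values).
E0, t0, b0 = 170e9, 20e-6, 1e-3
n = 10000

for ratio in ASPECT:
    # Assumed process variations: 3% sigma on modulus, 5% sigma on thickness.
    E = rng.normal(E0, 0.03 * E0, n)
    t = rng.normal(t0, 0.05 * t0, n)
    k = stiffness(E, t, b0, ratio)
    print(f"a/b = {ratio}: mean stiffness = {k.mean():.3e} Pa/m, "
          f"rel. std = {100 * k.std() / k.mean():.1f} %")
```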
2014 Proceedings of the 21st International Conference Mixed Design of Integrated Circuits and Systems (MIXDES), 2014
Bonding is one of the common processes performed during MEMS fabrication. It allows creating advanced microstructures and specific encapsulation. However, this process induces a residual stress in the structure, since an annealing step is required. In this paper the bonding of silicon and Pyrex glass is considered, as these materials have different coefficients of thermal expansion. We analyze a bonding process performed at an optimal temperature that induces as small a residual stress as possible. The simulations show the influence of the residual stress on membrane deflection over a typical temperature range of operation.
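As a rough illustration of why the bonding temperature matters, the thermo-mechanical mismatch stress in a thin silicon layer bonded to glass is often estimated with the biaxial thermal-stress expression σ ≈ E/(1−ν) · Δα · (T_bond − T_op). The sketch below evaluates it over an assumed operating range; the material constants and temperature values are illustrative assumptions, and this is not the FEM analysis of the paper.

```python
# Rough biaxial thermal-mismatch stress estimate for a Si layer bonded to glass.
# sigma ~ E/(1-nu) * (alpha_glass - alpha_si) * (T_bond - T_op); the material data
# below are illustrative assumptions, not the values used in the paper.

E_SI = 170e9          # Pa, Young's modulus of silicon
NU_SI = 0.28          # Poisson ratio of silicon
ALPHA_SI = 2.6e-6     # 1/K, CTE of silicon (near room temperature)
ALPHA_GLASS = 3.25e-6 # 1/K, CTE of a borosilicate (Pyrex-like) glass

def mismatch_stress(t_bond_c, t_op_c):
    """Residual biaxial stress (Pa) in the silicon layer after cooling from bonding."""
    d_alpha = ALPHA_GLASS - ALPHA_SI
    return E_SI / (1.0 - NU_SI) * d_alpha * (t_bond_c - t_op_c)

for t_bond in (300.0, 350.0, 400.0, 450.0):
    # Worst-case stress over an assumed -40..+85 C operating range.
    worst = max(abs(mismatch_stress(t_bond, t)) for t in (-40.0, 25.0, 85.0))
    print(f"bonding at {t_bond:.0f} C -> worst-case mismatch stress "
          f"{worst / 1e6:.1f} MPa over -40..85 C")
```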
22nd IEEE International Symposium on Defect and Fault-Tolerance in VLSI Systems (DFT 2007), 2007
We address two problems in this work, namely, 1) the resilience challenge in future nanochips made up of massively defective nanoelements and organized in replicative multicore architectures and 2) the issue of preserving the production yield. The main suggestion that we extensively study is that the chip should be self-configuring. We mean that the chip should execute, with almost no external testing or control mechanisms, mutual core tests to isolate the defective cores, and self-configuration of communications to discover routes in the defective network. Our contribution is a systematic study of the dependence of the production yield on the node failure probability (possibly as high as 0.4), in several networks with node connectivity ranging from 3 to 5. The result is expressed in terms of a probabilistic metric which guarantees that at least a minimal fraction of nodes can be contacted by the input-output port to participate in the processing in the defective network.
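The yield-versus-failure-probability dependence studied here can be illustrated with a small Monte Carlo sketch in the spirit of the abstract: nodes of a 2D mesh fail independently with probability p, and a chip is counted as good when the input-output port can still reach at least a minimal fraction of nodes through non-defective nodes. The grid size, threshold, connectivity (degree 4) and trial count below are assumptions for illustration, not the configurations studied in the paper.

```python
import random
from collections import deque

def yield_estimate(n=16, p_fail=0.2, min_fraction=0.5, trials=500, seed=0):
    """Fraction of random n x n meshes in which the corner IOP reaches at least
    min_fraction of all nodes through non-defective nodes only."""
    rng = random.Random(seed)
    good_chips = 0
    for _ in range(trials):
        ok = [[rng.random() >= p_fail for _ in range(n)] for _ in range(n)]
        if not ok[0][0]:
            continue  # the IOP node itself is defective: chip rejected
        seen = {(0, 0)}
        queue = deque([(0, 0)])
        while queue:                      # breadth-first search over working nodes
            x, y = queue.popleft()
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n and ok[nx][ny] and (nx, ny) not in seen:
                    seen.add((nx, ny))
                    queue.append((nx, ny))
        if len(seen) >= min_fraction * n * n:
            good_chips += 1
    return good_chips / trials

for p in (0.1, 0.2, 0.3, 0.4):
    print(f"node failure probability {p:.1f} -> estimated yield {yield_estimate(p_fail=p):.2f}")
```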
13th IEEE International On-Line Testing Symposium (IOLTS 2007), 2007
We describe a self-configuration methodology to tolerate defective nodes in chips organized in massively replicative architectures, made up of hundreds of cores in a highly defective technology. Note that the keyword of this presentation is not configuration but self-configuration. The basic idea is that chips will become so complex that it will be unrealistic to consider diagnosing all nodes and all routes with external equipment. On the contrary, chips should become autonomous and adaptive to preserve their resilience, and as little external intervention as possible should be involved to control the start-up phase and the subsequent operation. By self-configuration we mean self-diagnosis of cores through mutual tests (both at start-up and possibly at runtime), self-configuration of communications, self-shutdown of the cores which cannot take part in the processing, and ultimately, adaptive task allocation and redundant execution at runtime to cope with transient faults.
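The mutual-test idea can be sketched as follows; this is a toy illustration under assumed behaviour, not the diagnosis algorithm of the paper: each core tests its mesh neighbours, defective cores produce unreliable verdicts, and each good core simply stops communicating with any neighbour its own test flagged (logical, not physical, disconnection).

```python
import random

def mutual_test(n=6, p_fail=0.2, seed=1):
    """Toy mutual-test pass on an n x n mesh: every core tests its four neighbours,
    defective cores give unreliable verdicts, and each good core keeps only the
    neighbours it has diagnosed as good ("soft" disconnection of the rest)."""
    rng = random.Random(seed)
    good = {(x, y): rng.random() >= p_fail for x in range(n) for y in range(n)}
    links = {}  # (tester, tested) -> verdict, True = "neighbour looks good"
    for (x, y), g in good.items():
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (x + dx, y + dy)
            if nb not in good:
                continue
            if g:
                links[((x, y), nb)] = good[nb]            # good testers report correctly
            else:
                links[((x, y), nb)] = rng.random() < 0.5  # faulty testers are unreliable
    # Each good core keeps only the neighbours its own tests diagnosed as good.
    kept = {c: [nb for (t, nb), v in links.items() if t == c and v] for c in good if good[c]}
    isolated = sum(1 for c, g in good.items() if not g)
    print(f"{isolated} defective cores; good cores ignore flagged neighbours, "
          f"e.g. core (0, 0) keeps {kept.get((0, 0), 'n/a')}")

mutual_test()
```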
2008 14th IEEE International On-Line Testing Symposium, 2008
The downsizing of transistor dimensions enabled by future nanotechnologies will inevitably increase the number of faults in complex ULSI chips. To maintain the production yield at an acceptable level, several levels of protection mechanisms will have to be implemented to tolerate the permanent and transient faults occurring in the physical layers. In this paper, we study fault tolerance at the architectural level in multiport processor grids (MPG) through core dual diagnosis and self-configuration of communications. MPGs are considered to ensure the scalability of future hundred-core chips. We characterize defective technologies by the IOP reachability (i.e., the ability of the input-output ports to contact a fraction of cores in the grid), which we study as a function of the fraction of defective cores or links. We show that almost all valid cores in the grid are accessible by all input-output ports (IOPs) up to approximately 20-25% of defective cores. This property is quasi-independent of the position of the IOPs in the grid.
2014 Proceedings of the 21st International Conference Mixed Design of Integrated Circuits and Systems (MIXDES), 2014
A variety of thermal models has been proposed to predict the temperatures inside modern processors. In this paper, we describe and compare two such approaches, a detailed FEM-based simulation and a simpler architectural compact model. It is shown that both models provide comparable results when it comes to predicting the maximal temperature; however, there are also non-negligible differences when estimating thermal gradients within a chip.
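An architectural compact model of the kind compared here typically lumps each block into a thermal capacitance connected by thermal resistances to its neighbours and to the ambient. The sketch below integrates such a two-block RC network with forward Euler; all resistance, capacitance and power values are illustrative assumptions, not the parameters of the models compared in the paper.

```python
import numpy as np

# Two functional blocks coupled laterally and to ambient through thermal resistances.
R_AMB = 1.2      # K/W, block to ambient
R_LAT = 3.0      # K/W, block to block (lateral coupling)
C_BLK = 0.05     # J/K, thermal capacitance per block
T_AMB = 45.0     # C, ambient / heat-sink temperature
P = np.array([18.0, 4.0])   # W, power dissipated in each block (illustrative)

T = np.full(2, T_AMB)       # block temperatures, start at ambient
dt = 1e-3                   # s, integration step
for step in range(int(2.0 / dt)):        # simulate 2 seconds of operation
    q_amb = (T - T_AMB) / R_AMB          # heat flow from each block to ambient
    q_lat = (T - T[::-1]) / R_LAT        # heat exchanged between the two blocks
    dT = (P - q_amb - q_lat) * dt / C_BLK
    T = T + dT

print(f"steady-state estimate: hot block {T[0]:.1f} C, cool block {T[1]:.1f} C")
```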
2012 28th Annual IEEE Semiconductor Thermal Measurement and Management Symposium (SEMI-THERM), 2012
This paper presents the design of a test ASIC intended for the investigation of thermal coupling in many-core architectures. Particular sections of the paper describe in detail the design concept and the simulated operation of the ASIC, which has currently been sent for manufacturing.
2012 28th Annual IEEE Semiconductor Thermal Measurement and Management Symposium (SEMI-THERM), 2012
The constant increase of the subthreshold current of nanometer transistors due to technology scaling may hinder the evolution of high-performance chips in the near future. This creates the need for accurate leakage power modeling for new nanometer technologies. In this paper, we present an improved subthreshold current model, which was integrated into an architectural-level power simulator. Using this simulator, we…
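Such leakage simulators typically build on the standard exponential subthreshold expression I_sub ≈ I0·exp((V_GS − V_th)/(n·v_T))·(1 − exp(−V_DS/v_T)) with thermal voltage v_T = kT/q. The sketch below evaluates that textbook form over temperature; the I0, n, V_th and temperature-drift numbers are illustrative assumptions, and this is not the improved model presented in the paper.

```python
import math

K_B = 1.380649e-23     # J/K, Boltzmann constant
Q_E = 1.602176634e-19  # C, elementary charge

def i_sub(vgs, vds, temp_k, i0=1e-7, n=1.4, vth=0.3, k_t=-1e-3):
    """Standard subthreshold current expression (A); the threshold voltage drifts
    with temperature through the illustrative coefficient k_t (V/K)."""
    v_t = K_B * temp_k / Q_E                 # thermal voltage kT/q
    vth_t = vth + k_t * (temp_k - 300.0)     # simple linear Vth(T) assumption
    return i0 * math.exp((vgs - vth_t) / (n * v_t)) * (1.0 - math.exp(-vds / v_t))

for t_c in (25.0, 50.0, 75.0, 100.0):
    i = i_sub(vgs=0.0, vds=1.0, temp_k=t_c + 273.15)
    print(f"{t_c:5.1f} C -> subthreshold leakage per device ~ {i:.3e} A")
```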
19th International Workshop on Thermal Investigations of ICs and Systems (THERMINIC), 2013
One of the interesting thermal management techniques for multi-core processors is core swapping. In this paper, using published power data and floorplans for two modern high-performance processors, we employ the well-known HotSpot tool to perform a thermal simulation of the core swapping mechanism. Our transient simulations show that by using the core swapping technique it was possible to either reduce the hot spot temperature in the Ivy Bridge chip by 5 °C or increase the operating frequency by 17% while maintaining the same temperature as in the case without core swapping. We also derive an analytical model of the activity migration mechanism between two cores, which may serve as a tool to calculate the swapping frequency given the desired maximal temperature drop. The model also allows the cooling effectiveness to be correlated with the performance penalty induced by the swapping.
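A first-order version of such an activity-migration model treats each core as a single thermal time constant τ = R·C and alternates its power between P_on and 0 with swapping period P; the periodic steady-state peak temperature rise then works out to R·P_on / (1 + e^(−P/(2τ))), which tends to R·P_on without swapping and to R·P_on/2 for very fast swapping. The sketch below evaluates this single-pole relation; the R, C and power values are illustrative assumptions and this is not the exact model derived in the paper.

```python
import math

# Single-pole thermal model of one core: R (K/W) to ambient, C (J/K) heat capacity.
R_TH = 0.8     # K/W (illustrative)
C_TH = 0.12    # J/K (illustrative)
P_ON = 25.0    # W dissipated by the core while it is the active one
T_AMB = 50.0   # C, effective ambient / package temperature

TAU = R_TH * C_TH   # thermal time constant (s)

def peak_rise(period_s):
    """Steady-state peak temperature rise of a core whose power toggles between
    P_ON and 0 every period_s / 2, from the periodic solution of a single RC node:
    delta_T_peak = R * P_ON / (1 + exp(-period / (2 * tau)))."""
    a = math.exp(-period_s / (2.0 * TAU))
    return R_TH * P_ON / (1.0 + a)

print(f"no swapping (one core always active): peak {T_AMB + R_TH * P_ON:.1f} C")
for period in (1.0, 0.1, 0.01, 0.001):
    print(f"swap period {period:>6.3f} s -> peak {T_AMB + peak_rise(period):.1f} C")
```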
Facta universitatis - series: Electronics and Energetics, 2014
Several MEMS (Micro Electro-Mechanical Systems) devices have been analysed and simulated. The newly proposed model of SiC MPS (Merged PIN-Schottky) diodes is in full agreement with real MPS devices. The real-size DLL (Dynamic Lattice Liquid) simulator, as well as research on the modelling and simulation of modern VLSI devices with practical applications, is presented. Based on experience in the field of ATCA (Advanced Telecommunications Computing Architecture) based systems, a proof-of-concept DAQ (data acquisition) system for ITER (International Thermonuclear Experimental Reactor) has been proposed.
2014 15th International Conference on Thermal, Mechanical and Multi-Physics Simulation and Experiments in Microelectronics and Microsystems (EuroSimE), 2014
In this paper we present a tool which incorporates an analytical model of a microbolometer. Within the tool the user can freely change input parameters such as dimensions and material properties and immediately obtain output parameters such as responsivity, thermal time constant, etc. Moreover, the tool can be used to compute the transient thermal response of the microbolometer for a given radiation power and bias current. The model was validated against results obtained from ANSYS for several different devices, and the maximal relative error in the transient temperature response was found to be only 3%.
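The closed-form relations such a microbolometer tool is typically built around are the lumped thermal ones: thermal time constant τ = C_th/G_th, step response ΔT(t) = (η·P/G_th)(1 − e^(−t/τ)), and voltage responsivity R_v ≈ η·α·I_bias·R_bolo / (G_th·√(1 + ω²τ²)). The sketch below evaluates these textbook expressions for assumed, illustrative device parameters; it is not the validated model of the paper.

```python
import math

# Illustrative microbolometer parameters (assumptions, not the paper's devices).
G_TH = 1e-7      # W/K, thermal conductance of the support legs
C_TH = 1e-9      # J/K, thermal capacitance of the membrane
ETA = 0.8        # absorptance of the incident IR radiation
ALPHA = -0.02    # 1/K, temperature coefficient of resistance (VOx-like)
R_BOLO = 100e3   # ohm, bolometer resistance
I_BIAS = 10e-6   # A, bias current

TAU = C_TH / G_TH   # thermal time constant (s)

def delta_t(power_w, t_s):
    """Membrane temperature rise (K) for a radiation power step of power_w watts."""
    return ETA * power_w / G_TH * (1.0 - math.exp(-t_s / TAU))

def responsivity(freq_hz):
    """Voltage responsivity (V/W) at modulation frequency freq_hz."""
    omega = 2.0 * math.pi * freq_hz
    return abs(ETA * ALPHA * I_BIAS * R_BOLO) / (G_TH * math.sqrt(1.0 + (omega * TAU) ** 2))

print(f"thermal time constant: {TAU * 1e3:.1f} ms")
print(f"temperature rise after 3*tau for 100 nW: {delta_t(100e-9, 3 * TAU) * 1e3:.0f} mK")
for f in (1.0, 30.0, 100.0):
    print(f"responsivity at {f:>5.1f} Hz: {responsivity(f) / 1e3:.1f} kV/W")
```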
IEEE Transactions on Dependable and Secure Computing, 2000
We study chip self-organization and fault tolerance at the architectural level to improve the dependable continuous operation of multicore arrays in massively defective nanotechnologies. Architectural self-organization results from the conjunction of self-diagnosis and self-disconnection mechanisms (to identify and isolate most permanently faulty or inaccessible cores and routers), plus self-discovery of routes to maintain communication in the array. In the methodology presented in this work, chip self-diagnosis is performed in three steps, following an ascending order of complexity: interconnects are tested first, then routers through mutual test, and cores in the last step. The mutual testing of routers is especially important, as faulty routers are disconnected by good ones with no assumption on the behavior of defective elements. Moreover, the disconnection of faulty routers is not physical ("hard") but logical ("soft"), in that a good router simply stops communicating with any adjacent router diagnosed as defective. There is no physical reconfiguration in the chip and no need for spare elements. Ultimately, the multicore array may be viewed as a black box which incorporates protection mechanisms and self-organizes, while the external control reduces to a simple chip validation test which, in the simplest cases, amounts to counting the number of valid and accessible cores.