Paolo Bonzini - [PATCH] provide sbitmap_calloc
This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.
- From: Paolo Bonzini
- To: GCC Patches
- Date: Thu, 22 Feb 2007 14:12:55 +0100
- Subject: [PATCH] provide sbitmap_calloc
This could also give a small speed improvement (though I doubt it).

Bootstrapped/regtested i686-pc-linux-gnu, ok?

Paolo
2007-02-22  Paolo Bonzini  <bonzini@gnu.org>
* sbitmap.c (sbitmap_calloc): New.
* sbitmap.h (sbitmap_calloc): Declare it.
* calls.c (expand_call): Use it.
* cfganal.c (mark_dfs_back_edges, post_order_compute,
inverted_post_order_compute, pre_and_rev_post_order_compute,
flow_dfs_compute_reverse_init, dfs_enumerate_from): Likewise.
* cfglayout.c (break_superblocks): Likewise.
* cfgloop.c (flow_loops_find, verify_loop_structure): Likewise.
* cfgloopmanip.c (fix_bb_placements, remove_path, loopify): Likewise.
* cfgrtl.c (commit_edge_insertions): Likewise.
* ddg.c (create_ddg, longest_simple_path): Likewise.
* except.c (eh_region_outermost): Likewise.
* gcse.c (compute_pre_data, pre_gcse,
remove_reachable_equiv_notes): Likewise.
* loop-unroll.c (unroll_loop_runtime_iterations, peel_loop_simple,
unroll_loop_stupid): Likewise.
* lower-subreg.c (decompose_multiword_subregs): Likewise.
* modulo-sched.c (get_sched_window, sms_schedule_by_order,
order_nodes_of_sccs, order_nodes_in_scc): Likewise.
* recog.c (split_all_insns): Likewise.
* regrename.c (copyprop_hardreg_forward): Likewise.
* sched-rgn.c (find_rgns): Likewise.
* tree-into-ssa.c (rewrite_into_ssa, init_update_ssa, update_ssa): Likewise.
* tree-ssa-alias.c (init_alias_info): Likewise.
* tree-ssa-copy.c (dump_copy_of): Likewise.
* tree-ssa-dce.c (tree_dce_init, perform_tree_ssa_dce): Likewise.
* tree-ssa-dse.c (get_aggregate_vardecl): Likewise.
* tree-ssa-live.c (live_worklist): Likewise.
* tree-ssa-loop-im.c (fill_always_executed_in): Likewise.
* tree-ssa-loop-ivopts.c (multiplier_allowed_in_address_p): Likewise.
* tree-ssa-phiopt.c (blocks_in_phiopt_order): Likewise.
* tree-ssa-pre.c (compute_antic): Likewise.
* tree-ssa-propagate.c (ssa_prop_init): Likewise.
* tree-ssa-structalias.c (build_pred_graph, init_topo_info,
init_scc_info, solve_graph): Likewise.
* tree-stdarg.c (reachable_at_most_once): Likewise.
* tree-vrp.c (insert_range_assertions): Likewise.
* var-tracking.c (vt_find_locations): Likewise.
Index: tree-vrp.c
--- tree-vrp.c (revision 122189)
+++ tree-vrp.c (working copy)
@@ -3446,11 +3446,8 @@ insert_range_assertions (void)
edge_iterator ei;
bool update_ssa_p;
-  found_in_subgraph = sbitmap_alloc (num_ssa_names);
-  sbitmap_zero (found_in_subgraph);
-  blocks_visited = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (blocks_visited);
+  found_in_subgraph = sbitmap_calloc (num_ssa_names);
+  blocks_visited = sbitmap_calloc (last_basic_block);
 
   need_assert_for = BITMAP_ALLOC (NULL);
   asserts_for = XCNEWVEC (assert_locus_t, num_ssa_names);
Index: regrename.c
--- regrename.c	(revision 122189)
+++ regrename.c	(working copy)
@@ -1825,8 +1825,7 @@ copyprop_hardreg_forward (void)
 
   all_vd = XNEWVEC (struct value_data, last_basic_block);
 
-  visited = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (visited);
+  visited = sbitmap_calloc (last_basic_block);
 
   FOR_EACH_BB (bb)
     {
Index: tree-into-ssa.c
--- tree-into-ssa.c	(revision 122189)
+++ tree-into-ssa.c	(working copy)
@@ -2264,8 +2264,7 @@ rewrite_into_ssa (void)
   /* Initialize the set of interesting blocks.  The callback
      mark_def_sites will add to this set those blocks that the renamer
      should process.  */
-  interesting_blocks = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (interesting_blocks);
+  interesting_blocks = sbitmap_calloc (last_basic_block);
 
   /* Initialize dominance frontier.  */
   dfs = XNEWVEC (bitmap, last_basic_block);
@@ -2661,11 +2660,8 @@ init_update_ssa (void)
   /* Reserve more space than the current number of names.  The calls to
      add_new_name_mapping are typically done after creating new SSA
      names, so we'll need to reallocate these arrays.  */
-  old_ssa_names = sbitmap_alloc (num_ssa_names + NAME_SETS_GROWTH_FACTOR);
-  sbitmap_zero (old_ssa_names);
-  new_ssa_names = sbitmap_alloc (num_ssa_names + NAME_SETS_GROWTH_FACTOR);
-  sbitmap_zero (new_ssa_names);
+  old_ssa_names = sbitmap_calloc (num_ssa_names + NAME_SETS_GROWTH_FACTOR);
+  new_ssa_names = sbitmap_calloc (num_ssa_names + NAME_SETS_GROWTH_FACTOR);
 
   repl_tbl = htab_create (20, repl_map_hash, repl_map_eq, repl_map_free);
   need_to_initialize_update_ssa_p = false;
@@ -3318,8 +3314,7 @@ update_ssa (unsigned update_flags)
 	set_current_def (referenced_var (i), NULL_TREE);
 
       /* Now start the renaming process at START_BB.  */
-      tmp = sbitmap_alloc (last_basic_block);
-      sbitmap_zero (tmp);
+      tmp = sbitmap_calloc (last_basic_block);
       EXECUTE_IF_SET_IN_BITMAP (blocks_to_update, 0, i, bi)
 	SET_BIT (tmp, i);
 
Index: tree-ssa-loop-im.c
--- tree-ssa-loop-im.c	(revision 122189)
+++ tree-ssa-loop-im.c	(working copy)
@@ -1461,12 +1461,11 @@ fill_always_executed_in (struct loop *lo
 static void
 tree_ssa_lim_initialize (void)
 {
-  sbitmap contains_call = sbitmap_alloc (last_basic_block);
+  sbitmap contains_call = sbitmap_calloc (last_basic_block);
   block_stmt_iterator bsi;
   struct loop *loop;
   basic_block bb;
 
-  sbitmap_zero (contains_call);
   FOR_EACH_BB (bb)
     {
       for (bsi = bsi_start (bb); !bsi_end_p (bsi); bsi_next (&bsi))
Index: sbitmap.c
--- sbitmap.c	(revision 122189)
+++ sbitmap.c	(working copy)
@@ -86,6 +86,24 @@ sbitmap_alloc (unsigned int n_elms)
   return bmap;
 }
 
+/* Allocate a simple bitmap of N_ELMS bits.  */
+
+sbitmap
+sbitmap_calloc (unsigned int n_elms)
+{
+  unsigned int bytes, size, amt;
+  sbitmap bmap;
+
+  size = SBITMAP_SET_SIZE (n_elms);
+  bytes = size * sizeof (SBITMAP_ELT_TYPE);
+  amt = (sizeof (struct simple_bitmap_def)
+	 + bytes - sizeof (SBITMAP_ELT_TYPE));
+  bmap = xcalloc (amt, 1);
+  bmap->n_bits = n_elms;
+  bmap->size = size;
+  return bmap;
+}
+
 /* Allocate a simple bitmap of N_ELMS bits, and a popcount array.  */
 
 sbitmap
Index: sbitmap.h
--- sbitmap.h	(revision 122189)
+++ sbitmap.h	(working copy)
@@ -210,6 +210,7 @@ extern void dump_sbitmap_file (FILE *, s
 extern void dump_sbitmap_vector (FILE *, const char *, const char *,
 				 sbitmap *, int);
 extern sbitmap sbitmap_alloc (unsigned int);
+extern sbitmap sbitmap_calloc (unsigned int);
 extern sbitmap sbitmap_alloc_with_popcount (unsigned int);
 extern sbitmap *sbitmap_vector_alloc (unsigned int, unsigned int);
 extern sbitmap sbitmap_resize (sbitmap, unsigned int, int);
Index: cfgloopmanip.c
--- cfgloopmanip.c	(revision 122189)
+++ cfgloopmanip.c	(working copy)
@@ -189,8 +189,7 @@ fix_bb_placements (basic_block from,
   if (base_loop == current_loops->tree_root)
     return;
 
-  in_queue = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (in_queue);
+  in_queue = sbitmap_calloc (last_basic_block);
   SET_BIT (in_queue, from->index);
   /* Prevent us from going out of the base_loop.  */
   SET_BIT (in_queue, base_loop->header->index);
@@ -315,8 +314,7 @@ remove_path (edge e)
 
   n_bord_bbs = 0;
   bord_bbs = XCNEWVEC (basic_block, n_basic_blocks);
-  seen = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (seen);
+  seen = sbitmap_calloc (last_basic_block);
 
   /* Find "border" hexes -- i.e. those with predecessor in removed path.  */
   for (i = 0; i < nrem; i++)
@@ -531,8 +529,7 @@ loopify (edge latch_edge, edge header_ed
   /* Update dominators of blocks outside of LOOP.  */
   dom_bbs = XCNEWVEC (basic_block, n_basic_blocks);
   n_dom_bbs = 0;
-  seen = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (seen);
+  seen = sbitmap_calloc (last_basic_block);
   body = get_loop_body (loop);
 
   for (i = 0; i < loop->num_nodes; i++)
Index: ddg.c
--- ddg.c	(revision 122189)
+++ ddg.c	(working copy)
@@ -504,10 +504,8 @@ create_ddg (basic_block bb, int closing_
 	}
 
       g->nodes[i].cuid = i;
-      g->nodes[i].successors = sbitmap_alloc (num_nodes);
-      sbitmap_zero (g->nodes[i].successors);
-      g->nodes[i].predecessors = sbitmap_alloc (num_nodes);
-      sbitmap_zero (g->nodes[i].predecessors);
+      g->nodes[i].successors = sbitmap_calloc (num_nodes);
+      g->nodes[i].predecessors = sbitmap_calloc (num_nodes);
       g->nodes[i].first_note = (first_note ? first_note : insn);
       g->nodes[i++].insn = insn;
       first_note = NULL_RTX;
@@ -1031,8 +1029,7 @@ longest_simple_path (struct ddg * g, int
   int result;
   int num_nodes = g->num_nodes;
   sbitmap workset = sbitmap_alloc (num_nodes);
-  sbitmap tmp = sbitmap_alloc (num_nodes);
+  sbitmap tmp = sbitmap_calloc (num_nodes);
 
   /* Data will hold the distance of the longest path found so far from
      src to each node.  Initialize to -1 = less than minimum.  */
@@ -1040,9 +1037,7 @@ longest_simple_path (struct ddg * g, int
     g->nodes[i].aux.count = -1;
   g->nodes[src].aux.count = 0;
 
-  sbitmap_zero (tmp);
   SET_BIT (tmp, src);
 
   while (change)
     {
       sbitmap_iterator sbi;
Index: tree-ssa-dse.c
--- tree-ssa-dse.c	(revision 122189)
+++ tree-ssa-dse.c	(working copy)
@@ -388,8 +388,7 @@ get_aggregate_vardecl (tree decl, struct
 	abort ();
 
       av_p->ignore = true;
-      av_p->parts_set = sbitmap_alloc (HOST_BITS_PER_LONG);
-      sbitmap_zero (av_p->parts_set);
+      av_p->parts_set = sbitmap_calloc (HOST_BITS_PER_LONG);
       *slot = av_p;
     }
   else
Index: tree-ssa-loop-ivopts.c
--- tree-ssa-loop-ivopts.c	(revision 122189)
+++ tree-ssa-loop-ivopts.c	(working copy)
@@ -2821,8 +2821,7 @@ multiplier_allowed_in_address_p (HOST_WI
       rtx addr;
       HOST_WIDE_INT i;
 
-      valid_mult[mode] = sbitmap_alloc (2 * MAX_RATIO + 1);
-      sbitmap_zero (valid_mult[mode]);
+      valid_mult[mode] = sbitmap_calloc (2 * MAX_RATIO + 1);
       addr = gen_rtx_fmt_ee (MULT, Pmode, reg1, NULL_RTX);
       for (i = -MAX_RATIO; i <= MAX_RATIO; i++)
 	{
Index: modulo-sched.c
--- modulo-sched.c	(revision 122189)
+++ modulo-sched.c	(working copy)
@@ -1315,16 +1315,14 @@ get_sched_window (partial_schedule_ptr p
       ddg_edge_ptr e;
       int u = nodes_order [i];
       ddg_node_ptr u_node = &ps->g->nodes[u];
-      sbitmap psp = sbitmap_alloc (ps->g->num_nodes);
-      sbitmap pss = sbitmap_alloc (ps->g->num_nodes);
+      sbitmap psp = sbitmap_calloc (ps->g->num_nodes);
+      sbitmap pss = sbitmap_calloc (ps->g->num_nodes);
       sbitmap u_node_preds = NODE_PREDECESSORS (u_node);
       sbitmap u_node_succs = NODE_SUCCESSORS (u_node);
       int psp_not_empty;
       int pss_not_empty;
 
       /* 1. compute sched window for u (start, end, step).  */
-      sbitmap_zero (psp);
-      sbitmap_zero (pss);
       psp_not_empty = sbitmap_a_and_b_cg (psp, u_node_preds, sched_nodes);
       pss_not_empty = sbitmap_a_and_b_cg (pss, u_node_succs, sched_nodes);
 
@@ -1441,7 +1439,7 @@ sms_schedule_by_order (ddg_ptr g, int mi
   int num_nodes = g->num_nodes;
   ddg_edge_ptr e;
   int start, end, step; /* Place together into one struct?  */
-  sbitmap sched_nodes = sbitmap_alloc (num_nodes);
+  sbitmap sched_nodes = sbitmap_calloc (num_nodes);
   sbitmap must_precede = sbitmap_alloc (num_nodes);
   sbitmap must_follow = sbitmap_alloc (num_nodes);
   sbitmap tobe_scheduled = sbitmap_alloc (num_nodes);
@@ -1449,7 +1447,6 @@ sms_schedule_by_order (ddg_ptr g, int mi
   partial_schedule_ptr ps = create_partial_schedule (ii, g, DFA_HISTORY);
 
   sbitmap_ones (tobe_scheduled);
-  sbitmap_zero (sched_nodes);
 
   while ((! sbitmap_equal (tobe_scheduled, sched_nodes)
 	 || try_again_with_larger_ii ) && ii < maxii)
@@ -1613,9 +1610,7 @@ static void
 check_nodes_order (int *node_order, int num_nodes)
 {
   int i;
-  sbitmap tmp = sbitmap_alloc (num_nodes);
-
-  sbitmap_zero (tmp);
+  sbitmap tmp = sbitmap_calloc (num_nodes);
 
   for (i = 0; i < num_nodes; i++)
     {
@@ -1667,12 +1662,11 @@ order_nodes_of_sccs (ddg_all_sccs_ptr al
   int i, pos = 0;
   ddg_ptr g = all_sccs->ddg;
   int num_nodes = g->num_nodes;
-  sbitmap prev_sccs = sbitmap_alloc (num_nodes);
+  sbitmap prev_sccs = sbitmap_calloc (num_nodes);
   sbitmap on_path = sbitmap_alloc (num_nodes);
   sbitmap tmp = sbitmap_alloc (num_nodes);
   sbitmap ones = sbitmap_alloc (num_nodes);
 
-  sbitmap_zero (prev_sccs);
   sbitmap_ones (ones);
 
   /* Perfrom the node ordering starting from the SCC with the highest recMII.
@@ -1856,18 +1850,14 @@ order_nodes_in_scc (ddg_ptr g, sbitmap n
   enum sms_direction dir;
   int num_nodes = g->num_nodes;
   sbitmap workset = sbitmap_alloc (num_nodes);
-  sbitmap tmp = sbitmap_alloc (num_nodes);
-  sbitmap zero_bitmap = sbitmap_alloc (num_nodes);
-  sbitmap predecessors = sbitmap_alloc (num_nodes);
-  sbitmap successors = sbitmap_alloc (num_nodes);
+  sbitmap tmp = sbitmap_calloc (num_nodes);
+  sbitmap zero_bitmap = sbitmap_calloc (num_nodes);
+  sbitmap predecessors = sbitmap_calloc (num_nodes);
+  sbitmap successors = sbitmap_calloc (num_nodes);
 
-  sbitmap_zero (predecessors);
   find_predecessors (predecessors, g, nodes_ordered);
 
-  sbitmap_zero (successors);
   find_successors (successors, g, nodes_ordered);
 
-  sbitmap_zero (tmp);
   if (sbitmap_a_and_b_cg (tmp, predecessors, scc))
     {
       sbitmap_copy (workset, tmp);
@@ -1888,7 +1878,6 @@ order_nodes_in_scc (ddg_ptr g, sbitmap n
       dir = BOTTOMUP;
     }
 
-  sbitmap_zero (zero_bitmap);
   while (!sbitmap_equal (workset, zero_bitmap))
     {
       int v;
Index: tree-stdarg.c
--- tree-stdarg.c	(revision 122189)
+++ tree-stdarg.c	(working copy)
@@ -62,8 +62,7 @@ reachable_at_most_once (basic_block va_a
   stack = XNEWVEC (edge, n_basic_blocks + 1);
   sp = 0;
 
-  visited = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (visited);
+  visited = sbitmap_calloc (last_basic_block);
   ret = true;
 
   FOR_EACH_EDGE (e, ei, va_arg_bb->preds)
Index: tree-ssa-propagate.c
--- tree-ssa-propagate.c	(revision 122189)
+++ tree-ssa-propagate.c	(working copy)
@@ -469,11 +469,8 @@ ssa_prop_init (void)
   interesting_ssa_edges = VEC_alloc (tree, gc, 20);
   varying_ssa_edges = VEC_alloc (tree, gc, 20);
 
-  executable_blocks = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (executable_blocks);
-  bb_in_list = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (bb_in_list);
+  executable_blocks = sbitmap_calloc (last_basic_block);
+  bb_in_list = sbitmap_calloc (last_basic_block);
 
   if (dump_file && (dump_flags & TDF_DETAILS))
     dump_immediate_uses (dump_file);
Index: tree-ssa-alias.c
--- tree-ssa-alias.c	(revision 122189)
+++ tree-ssa-alias.c	(working copy)
@@ -1108,8 +1108,7 @@ init_alias_info (void)
   tree var;
 
   ai = XCNEW (struct alias_info);
-  ai->ssa_names_visited = sbitmap_alloc (num_ssa_names);
-  sbitmap_zero (ai->ssa_names_visited);
+  ai->ssa_names_visited = sbitmap_calloc (num_ssa_names);
   ai->processed_ptrs = VEC_alloc (tree, heap, 50);
   ai->written_vars = pointer_set_create ();
   ai->dereferenced_ptrs_store = pointer_set_create ();
Index: cfganal.c
--- cfganal.c	(revision 122189)
+++ cfganal.c	(working copy)
@@ -175,10 +175,7 @@ mark_dfs_back_edges (void)
   sp = 0;
 
   /* Allocate bitmap to track nodes that have been visited.  */
-  visited = sbitmap_alloc (last_basic_block);
-
-  /* None of the nodes in the CFG have been visited yet.  */
-  sbitmap_zero (visited);
+  visited = sbitmap_calloc (last_basic_block);
 
   /* Push the first edge on to the stack.  */
   stack[sp++] = ei_start (ENTRY_BLOCK_PTR->succs);
@@ -668,10 +665,7 @@ post_order_compute (int *post_order, boo
   sp = 0;
 
   /* Allocate bitmap to track nodes that have been visited.  */
-  visited = sbitmap_alloc (last_basic_block);
-
-  /* None of the nodes in the CFG have been visited yet.  */
-  sbitmap_zero (visited);
+  visited = sbitmap_calloc (last_basic_block);
 
   /* Push the first edge on to the stack.  */
   stack[sp++] = ei_start (ENTRY_BLOCK_PTR->succs);
@@ -766,8 +760,7 @@ post_order_compute (int *post_order, boo
 static basic_block
 dfs_find_deadend (basic_block bb)
 {
-  sbitmap visited = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (visited);
+  sbitmap visited = sbitmap_calloc (last_basic_block);
 
   for (;;)
     {
@@ -821,10 +814,7 @@ inverted_post_order_compute (int *post_o
   sp = 0;
 
   /* Allocate bitmap to track nodes that have been visited.  */
-  visited = sbitmap_alloc (last_basic_block);
-
-  /* None of the nodes in the CFG have been visited yet.  */
-  sbitmap_zero (visited);
+  visited = sbitmap_calloc (last_basic_block);
 
   /* Put all blocks that have no successor into the initial work list.  */
   FOR_BB_BETWEEN (bb, ENTRY_BLOCK_PTR, NULL, next_bb)
@@ -970,10 +960,7 @@ pre_and_rev_post_order_compute (int *pre
   rev_post_order_num -= NUM_FIXED_BLOCKS;
 
   /* Allocate bitmap to track nodes that have been visited.  */
-  visited = sbitmap_alloc (last_basic_block);
-
-  /* None of the nodes in the CFG have been visited yet.  */
-  sbitmap_zero (visited);
+  visited = sbitmap_calloc (last_basic_block);
 
   /* Push the first edge on to the stack.  */
   stack[sp++] = ei_start (ENTRY_BLOCK_PTR->succs);
@@ -1083,10 +1070,7 @@ flow_dfs_compute_reverse_init (depth_fir
   data->sp = 0;
 
   /* Allocate bitmap to track nodes that have been visited.  */
-  data->visited_blocks = sbitmap_alloc (last_basic_block);
-
-  /* None of the nodes in the CFG have been visited yet.  */
-  sbitmap_zero (data->visited_blocks);
+  data->visited_blocks = sbitmap_calloc (last_basic_block);
 
   return;
 }
@@ -1174,9 +1158,7 @@ dfs_enumerate_from (basic_block bb, int
 
   if (!visited)
     {
-      visited = sbitmap_alloc (size);
-      sbitmap_zero (visited);
+      visited = sbitmap_calloc (size);
       v_size = size;
     }
   else if (v_size < size)
Index: recog.c
--- recog.c	(revision 122189)
+++ recog.c	(working copy)
@@ -2600,8 +2600,7 @@ split_all_insns (void)
   bool changed;
   basic_block bb;
 
-  blocks = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (blocks);
+  blocks = sbitmap_calloc (last_basic_block);
   changed = false;
 
   FOR_EACH_BB_REVERSE (bb)
Index: gcse.c
--- gcse.c	(revision 122189)
+++ gcse.c	(working copy)
@@ -3833,8 +3833,7 @@ compute_pre_data (void)
   sbitmap_vector_zero (ae_kill, last_basic_block);
 
   /* Collect expressions which might trap.  */
-  trapping_expr = sbitmap_alloc (expr_hash_table.n_elems);
-  sbitmap_zero (trapping_expr);
+  trapping_expr = sbitmap_calloc (expr_hash_table.n_elems);
   for (ui = 0; ui < expr_hash_table.size; ui++)
     {
       struct expr *e;
@@ -4513,8 +4512,7 @@ pre_gcse (void)
     index_map[expr->bitmap_index] = expr;
 
   /* Reset bitmap used to track which insns are redundant.  */
-  pre_redundant_insns = sbitmap_alloc (max_cuid);
-  sbitmap_zero (pre_redundant_insns);
+  pre_redundant_insns = sbitmap_calloc (max_cuid);
 
   /* Delete the redundant insns first so that
      - we know what register to use for the new insns and for the
       other
@@ -6245,7 +6243,7 @@ remove_reachable_equiv_notes (basic_bloc
   edge_iterator *stack, ei;
   int sp;
   edge act;
-  sbitmap visited = sbitmap_alloc (last_basic_block);
+  sbitmap visited = sbitmap_calloc (last_basic_block);
   rtx last, insn, note;
   rtx mem = smexpr->pattern;
 
@@ -6253,8 +6251,6 @@ remove_reachable_equiv_notes (basic_bloc
   sp = 0;
   ei = ei_start (bb->succs);
 
-  sbitmap_zero (visited);
-
   act = (EDGE_COUNT (ei_container (ei)) > 0 ? EDGE_I (ei_container (ei), 0) : NULL);
   while (1)
     {
Index: loop-unroll.c
--- loop-unroll.c (revision 122189)
+++ loop-unroll.c (working copy)
@@ -998,13 +998,12 @@ unroll_loop_runtime_iterations (struct l
remove_edges = NULL;
- wont_exit = sbitmap_alloc (max_unroll + 2);
+ wont_exit = sbitmap_calloc (max_unroll + 2);
/* Peel the first copy of loop body (almost always we must leave exit test
here; the only exception is when we have extra zero check and the number
of iterations is reliable. Also record the place of (possible) extra
zero check. */
- sbitmap_zero (wont_exit);
if (extra_zero_check
&& !desc->noloop_assumptions)
SET_BIT (wont_exit, 1);
@@ -1258,8 +1257,7 @@ peel_loop_simple (struct loop *loop)
if (flag_split_ivs_in_unroller && npeel > 1)
opt_info = analyze_insns_in_loop (loop);
- wont_exit = sbitmap_alloc (npeel + 1);
- sbitmap_zero (wont_exit);
+ wont_exit = sbitmap_calloc (npeel + 1);
opt_info_start_duplication (opt_info);
@@ -1408,9 +1406,7 @@ unroll_loop_stupid (struct loop *loop)
|| flag_variable_expansion_in_unroller)
opt_info = analyze_insns_in_loop (loop);
- wont_exit = sbitmap_alloc (nunroll + 1);
- sbitmap_zero (wont_exit);
+  wont_exit = sbitmap_calloc (nunroll + 1);
 
   opt_info_start_duplication (opt_info);
ok = duplicate_loop_to_header_edge (loop, loop_latch_edge (loop),
Index: tree-ssa-phiopt.c
--- tree-ssa-phiopt.c	(revision 122189)
+++ tree-ssa-phiopt.c	(working copy)
@@ -254,13 +254,11 @@ blocks_in_phiopt_order (void)
   basic_block *order = XNEWVEC (basic_block, n_basic_blocks);
   unsigned n = n_basic_blocks - NUM_FIXED_BLOCKS;
   unsigned np, i;
-  sbitmap visited = sbitmap_alloc (last_basic_block);
+  sbitmap visited = sbitmap_calloc (last_basic_block);
 
 #define MARK_VISITED(BB) (SET_BIT (visited, (BB)->index))
 #define VISITED_P(BB) (TEST_BIT (visited, (BB)->index))
 
-  sbitmap_zero (visited);
   MARK_VISITED (ENTRY_BLOCK_PTR);
   FOR_EACH_BB (x)
     {
Index: calls.c
--- calls.c	(revision 122189)
+++ calls.c	(working copy)
@@ -2329,8 +2329,7 @@ expand_call (tree exp, rtx target, int i
 #else
 	  = plus_constant (argblock, -current_function_pretend_args_size);
 #endif
-      stored_args_map = sbitmap_alloc (args_size.constant);
-      sbitmap_zero (stored_args_map);
+      stored_args_map = sbitmap_calloc (args_size.constant);
     }
 
   /* If we have no actual push instructions, or shouldn't use them,
Index: lower-subreg.c
--- lower-subreg.c	(revision 122195)
+++ lower-subreg.c	(working copy)
@@ -1021,8 +1021,7 @@ decompose_multiword_subregs (void)
 	propagate_pseudo_copies ();
       no_new_pseudos = 0;
 
-      blocks = sbitmap_alloc (last_basic_block);
-      sbitmap_zero (blocks);
+      blocks = sbitmap_calloc (last_basic_block);
 
       EXECUTE_IF_SET_IN_BITMAP (decomposable_context, 0, regno, iter)
 	decompose_register (regno);
Index: except.c
--- except.c	(revision 122189)
+++ except.c	(working copy)
@@ -1097,9 +1097,7 @@ eh_region_outermost (struct function *if
   gcc_assert (rp_a != NULL);
   gcc_assert (rp_b != NULL);
 
-  b_outer = sbitmap_alloc (ifun->eh->last_region_number + 1);
-  sbitmap_zero (b_outer);
+  b_outer = sbitmap_calloc (ifun->eh->last_region_number + 1);
 
   do
     {
      SET_BIT (b_outer, rp_b->region_number);
Index: tree-ssa-pre.c
--- tree-ssa-pre.c	(revision 122189)
+++ tree-ssa-pre.c	(working copy)
@@ -1913,9 +1913,7 @@ compute_antic (void)
   /* If any predecessor edges are abnormal, we punt, so antic_in is empty.
      We pre-build the map of blocks with incoming abnormal edges here.  */
-  has_abnormal_preds = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (has_abnormal_preds);
+  has_abnormal_preds = sbitmap_calloc (last_basic_block);
 
   FOR_EACH_BB (block)
     {
       edge_iterator ei;
Index: tree-ssa-live.c
--- tree-ssa-live.c	(revision 122189)
+++ tree-ssa-live.c	(working copy)
@@ -620,11 +620,9 @@ live_worklist (tree_live_info_p live)
 {
   unsigned b;
   basic_block bb;
-  sbitmap visited = sbitmap_alloc (last_basic_block + 1);
+  sbitmap visited = sbitmap_calloc (last_basic_block + 1);
   bitmap tmp = BITMAP_ALLOC (NULL);
 
-  sbitmap_zero (visited);
-
   /* Visit all the blocks in reverse order and propogate live on entry values
      into the predecessors blocks.  */
   FOR_EACH_BB_REVERSE (bb)
Index: tree-ssa-copy.c
--- tree-ssa-copy.c (revision 122189)
+++ tree-ssa-copy.c (working copy)
@@ -540,8 +540,7 @@ dump_copy_of (FILE *file, tree var)
if (TREE_CODE (var) != SSA_NAME)
return;
- visited = sbitmap_alloc (num_ssa_names);
- sbitmap_zero (visited);
+ visited = sbitmap_calloc (num_ssa_names);
SET_BIT (visited, SSA_NAME_VERSION (var));
fprintf (file, " copy-of chain: ");
Index: cfglayout.c
--- cfglayout.c	(revision 122189)
+++ cfglayout.c	(working copy)
@@ -1090,8 +1090,7 @@ break_superblocks (void)
   bool need = false;
   basic_block bb;
 
-  superblocks = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (superblocks);
+  superblocks = sbitmap_calloc (last_basic_block);
 
   FOR_EACH_BB (bb)
     if (bb->flags & BB_SUPERBLOCK)
Index: tree-ssa-dce.c
--- tree-ssa-dce.c	(revision 122189)
+++ tree-ssa-dce.c	(working copy)
@@ -760,13 +760,10 @@ tree_dce_init (bool aggressive)
       for (i = 0; i < last_basic_block; ++i)
 	control_dependence_map[i] = BITMAP_ALLOC (NULL);
 
-      last_stmt_necessary = sbitmap_alloc (last_basic_block);
-      sbitmap_zero (last_stmt_necessary);
+      last_stmt_necessary = sbitmap_calloc (last_basic_block);
     }
 
-  processed = sbitmap_alloc (num_ssa_names + 1);
-  sbitmap_zero (processed);
+  processed = sbitmap_calloc (num_ssa_names + 1);
 
   worklist = VEC_alloc (tree, heap, 64);
   cfg_altered = false;
 }
@@ -824,9 +821,7 @@ perform_tree_ssa_dce (bool aggressive)
       find_all_control_dependences (el);
       timevar_pop (TV_CONTROL_DEPENDENCES);
 
-      visited_control_parents = sbitmap_alloc (last_basic_block);
-      sbitmap_zero (visited_control_parents);
+      visited_control_parents = sbitmap_calloc (last_basic_block);
 
       mark_dfs_back_edges ();
     }
Index: var-tracking.c
--- var-tracking.c	(revision 122189)
+++ var-tracking.c	(working copy)
@@ -1820,9 +1820,8 @@ vt_find_locations (void)
   worklist = fibheap_new ();
   pending = fibheap_new ();
   visited = sbitmap_alloc (last_basic_block);
-  in_worklist = sbitmap_alloc (last_basic_block);
+  in_worklist = sbitmap_calloc (last_basic_block);
   in_pending = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (in_worklist);
 
   FOR_EACH_BB (bb)
     fibheap_insert (pending, bb_order[bb->index], bb);
Index: cfgloop.c
--- cfgloop.c	(revision 122189)
+++ cfgloop.c	(working copy)
@@ -366,8 +366,7 @@ flow_loops_find (struct loops *loops)
   /* Count the number of loop headers.  This should be the same
      as the number of natural loops.  */
-  headers = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (headers);
+  headers = sbitmap_calloc (last_basic_block);
 
   num_loops = 0;
   FOR_EACH_BB (header)
@@ -1361,7 +1360,7 @@ verify_loop_structure (void)
   if (current_loops->state & LOOPS_HAVE_MARKED_IRREDUCIBLE_REGIONS)
     {
      /* Record old info.  */
-      irreds = sbitmap_alloc (last_basic_block);
+      irreds = sbitmap_calloc (last_basic_block);
       FOR_EACH_BB (bb)
 	{
 	  edge_iterator ei;
Index: sched-rgn.c
--- sched-rgn.c	(revision 122189)
+++ sched-rgn.c	(working copy)
@@ -531,14 +531,9 @@ find_rgns (void)
   inner = sbitmap_alloc (last_basic_block);
   sbitmap_ones (inner);
 
-  header = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (header);
-  in_queue = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (in_queue);
-  in_stack = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (in_stack);
+  header = sbitmap_calloc (last_basic_block);
+  in_queue = sbitmap_calloc (last_basic_block);
+  in_stack = sbitmap_calloc (last_basic_block);
   for (i = 0; i < last_basic_block; i++)
     max_hdr[i] = -1;
 
@@ -682,8 +677,7 @@ find_rgns (void)
   if (extend_regions_p)
     {
       degree1 = xmalloc (last_basic_block * sizeof (int));
-      extended_rgn_header = sbitmap_alloc (last_basic_block);
-      sbitmap_zero (extended_rgn_header);
+      extended_rgn_header = sbitmap_calloc (last_basic_block);
     }
 
   /* Find blocks which are inner loop headers.  We still have non-reducible
Index: tree-ssa-structalias.c
--- tree-ssa-structalias.c	(revision 122189)
+++ tree-ssa-structalias.c	(working copy)
@@ -946,8 +946,7 @@ build_pred_graph (void)
   graph->eq_rep = XNEWVEC (int, graph->size);
   graph->complex = XCNEWVEC (VEC(constraint_t, heap) *,
			      VEC_length (varinfo_t, varmap));
-  graph->direct_nodes = sbitmap_alloc (graph->size);
-  sbitmap_zero (graph->direct_nodes);
+  graph->direct_nodes = sbitmap_calloc (graph->size);
 
   for (j = 0; j < FIRST_REF_NODE; j++)
     {
@@ -1258,8 +1257,7 @@ init_topo_info (void)
 {
   size_t size = VEC_length (varinfo_t, varmap);
   struct topo_info *ti = XNEW (struct topo_info);
-  ti->visited = sbitmap_alloc (size);
-  sbitmap_zero (ti->visited);
+  ti->visited = sbitmap_calloc (size);
   ti->topo_order = VEC_alloc (unsigned, heap, 1);
   return ti;
 }
@@ -1556,10 +1554,8 @@ init_scc_info (size_t size)
   size_t i;
 
   si->current_index = 0;
-  si->visited = sbitmap_alloc (size);
-  sbitmap_zero (si->visited);
-  si->roots = sbitmap_alloc (size);
-  sbitmap_zero (si->roots);
+  si->visited = sbitmap_calloc (size);
+  si->roots = sbitmap_calloc (size);
   si->node_mapping = XNEWVEC (unsigned int, size);
   si->dfs = XCNEWVEC (unsigned int, size);
 
@@ -2013,8 +2009,7 @@ solve_graph (constraint_graph_t graph)
   bitmap pts;
 
   changed_count = 0;
-  changed = sbitmap_alloc (size);
-  sbitmap_zero (changed);
+  changed = sbitmap_calloc (size);
 
   /* Mark all initial non-collapsed nodes as changed.  */
   for (i = 0; i < size; i++)
Index: cfgrtl.c
--- cfgrtl.c	(revision 122189)
+++ cfgrtl.c	(working copy)
@@ -1484,8 +1484,7 @@ commit_edge_insertions (void)
   if (current_ir_type () == IR_RTL_CFGLAYOUT)
     return;
 
-  blocks = sbitmap_alloc (last_basic_block);
-  sbitmap_zero (blocks);
+  blocks = sbitmap_calloc (last_basic_block);
 
   FOR_EACH_BB (bb)
     if (bb->aux)
      {
Follow-Ups:
- Re: [PATCH] provide sbitmap_calloc
  - From: Richard Henderson