GCC Middle and Back End API Reference

Data Structures

    struct name_to_bb
    struct ssa_names_hasher

Variables

    static unsigned int nt_call_phase
    static struct pointer_set_t * nontrap_set
    static hash_table< ssa_names_hasher > seen_ssa_names
static bool abs_replacement ( )
The function abs_replacement does the main work of doing the absolute value replacement. Return true if the replacement is done, otherwise return false. BB is the basic block where the replacement is going to be done. ARG0 is argument 0 from the PHI. Likewise for ARG1.
References edge_def::dest, duplicate_ssa_name(), extract_true_false_edges_from_block(), gimple_assign_lhs(), gimple_assign_rhs1(), gimple_assign_rhs_code(), gimple_build_assign_with_ops(), gimple_cond_code(), gimple_cond_lhs(), gimple_cond_rhs(), gsi_insert_after(), gsi_insert_before(), gsi_last_bb(), GSI_NEW_STMT, integer_zerop(), last_and_only_stmt(), last_stmt(), make_ssa_name(), real_zerop(), and replace_phi_edge_with_variable().
Referenced by tree_ssa_phiopt_worker().
static void add_or_mark_expr ( )
We see the expression EXP in basic block BB. If it's an interesting expression (a MEM_REF through an SSA_NAME), possibly insert the expression into the set NONTRAP or the hash table of seen expressions. STORE is true if this expression is on the LHS, otherwise it's on the RHS.
References basic_block_def::aux, name_to_bb::bb, hash_table< Descriptor, Allocator >::find_slot(), host_integerp(), HOST_WIDE_INT, int_size_in_bytes(), nt_call_phase, name_to_bb::offset, name_to_bb::phase, pointer_set_insert(), name_to_bb::size, name_to_bb::ssa_name_ver, name_to_bb::store, and tree_low_cst().
Referenced by nt_init_block().
basic_block * blocks_in_phiopt_order ( void )
Returns the list of basic blocks in the function in an order that guarantees that if a block X has just a single predecessor Y, then Y is after X in the ordering.
References bitmap_clear(), order, sbitmap_alloc(), sbitmap_free(), single_pred(), single_pred_p(), and visited.
Referenced by tree_ssa_ifcombine(), and tree_ssa_phiopt_worker().
static bool cond_if_else_store_replacement ( )
Conditional store replacement. We already know that the recognized pattern looks like so:

   split:
     if (cond) goto THEN_BB; else goto ELSE_BB (edge E1)
   THEN_BB:
     ...
     X = Y;
     ...
     goto JOIN_BB;
   ELSE_BB:
     ...
     X = Z;
     ...
     fallthrough (edge E0)
   JOIN_BB:
     some more

We check that it is safe to sink the store to JOIN_BB by verifying that there are no read-after-write or write-after-write dependencies in THEN_BB and ELSE_BB.
References chrec_dont_know, chrec_known, compute_all_dependences(), cond_if_else_store_replacement_1(), DDR_A, DDR_ARE_DEPENDENT, DDR_B, DR_IS_READ, DR_IS_WRITE, DR_STMT, find_data_references_in_bb(), free_data_refs(), free_dependence_relations(), gimple_get_lhs(), gimple_uid(), last_and_only_stmt(), operand_equal_p(), renumber_gimple_stmt_uids_in_blocks(), and vNULL.
Referenced by tree_ssa_phiopt_worker().
static bool cond_if_else_store_replacement_1 ( )
Do the main work of conditional store replacement.
References add_phi_arg(), create_phi_node(), get_base_address(), gimple_assign_lhs(), gimple_assign_rhs1(), gimple_assign_single_p(), gimple_clobber_p(), gimple_has_volatile_ops(), gimple_location(), gsi_after_labels(), gsi_end_p(), gsi_for_stmt(), gsi_insert_after(), gsi_insert_before(), gsi_last_bb(), GSI_NEW_STMT, gsi_remove(), is_gimple_reg_type(), make_temp_ssa_name(), operand_equal_p(), release_defs(), and unlink_stmt_vdef().
Referenced by cond_if_else_store_replacement().
static bool cond_store_replacement ( )
Do the main work of conditional store replacement. We already know that the recognized pattern looks like so:

   split:
     if (cond) goto MIDDLE_BB; else goto JOIN_BB (edge E1)
   MIDDLE_BB:
     something
     fallthrough (edge E0)
   JOIN_BB:
     some more

We check that MIDDLE_BB contains only one store, that that store doesn't trap (not via NOTRAP, but via checking if an access to the same memory location dominates us) and that the store has a "simple" RHS.
References add_phi_arg(), create_phi_node(), gimple_assign_lhs(), gimple_assign_rhs1(), gimple_assign_single_p(), gimple_has_volatile_ops(), gimple_location(), gimple_set_location(), gsi_after_labels(), gsi_end_p(), gsi_for_stmt(), gsi_insert_after(), gsi_insert_before(), gsi_insert_on_edge(), gsi_last_bb(), GSI_NEW_STMT, gsi_remove(), is_gimple_reg_type(), last_and_only_stmt(), make_temp_ssa_name(), pointer_set_contains(), release_defs(), unlink_stmt_vdef(), and unshare_expr().
Referenced by tree_ssa_phiopt_worker().
static bool conditional_replacement ( )
The function conditional_replacement does the main work of doing the conditional replacement. Return true if the replacement is done, otherwise return false. BB is the basic block where the replacement is going to be done. ARG0 is argument 0 from PHI. Likewise for ARG1.
References empty_block_p(), extract_true_false_edges_from_block(), fold_convert_loc(), force_gimple_operand_gsi(), gimple_build_assign_with_ops(), gimple_cond_code(), gimple_cond_lhs(), gimple_cond_rhs(), gimple_location(), gimple_phi_arg_location(), gimple_set_location(), gsi_for_stmt(), gsi_insert_before(), GSI_SAME_STMT, integer_all_onesp(), integer_onep(), integer_zerop(), last_stmt(), make_ssa_name(), replace_phi_edge_with_variable(), and useless_type_conversion_p().
Referenced by tree_ssa_phiopt_worker().
static bool gate_cselim ( )
static bool gate_hoist_loads ( )
Determine whether we should attempt to hoist adjacent loads out of diamond patterns in pass_phiopt. Always hoist loads if -fhoist-adjacent-loads is specified and the target machine has both a conditional move instruction and a defined cache line size.
Referenced by tree_ssa_phiopt().
static bool gate_phiopt ( )
Always do these optimizations if we have SSA trees to work on.
static struct pointer_set_t * get_non_trapping ( )
This is the entry point for gathering non-trapping memory accesses. It will do a dominator walk over the whole function, and it will make use of the bb->aux pointers. It returns a set of trees (the MEM_REFs themselves) which can't trap.
References dom_walk_data::after_dom_children, dom_walk_data::before_dom_children, dom_walk_data::block_local_data_size, calculate_dominance_info(), CDI_DOMINATORS, clear_aux_for_blocks(), hash_table< Descriptor, Allocator >::create(), hash_table< Descriptor, Allocator >::dispose(), fini_walk_dominator_tree(), dom_walk_data::global_data, init_walk_dominator_tree(), dom_walk_data::initialize_block_local_data, nt_call_phase, nt_fini_block(), nt_init_block(), pointer_set_create(), and walk_dominator_tree().
Referenced by tree_ssa_phiopt_worker().
static void hoist_adjacent_loads ( )
Given a "diamond" control-flow pattern where BB0 tests a condition, BB1 and BB2 are "then" and "else" blocks dependent on this test, and BB3 rejoins control flow following BB1 and BB2, look for opportunities to hoist loads as follows. If BB3 contains a PHI of two loads, one each occurring in BB1 and BB2, and the loads are provably of adjacent fields in the same structure, then move both loads into BB0. Of course this can only be done if there are no dependencies preventing such motion. One of the hoisted loads will always be speculative, so the transformation is currently conservative:

 - The fields must be strictly adjacent.
 - The two fields must occupy a single memory block that is guaranteed to not cross a page boundary.

The last is difficult to prove, as such memory blocks should be aligned on the minimum of the stack alignment boundary and the alignment guaranteed by heap allocation interfaces. Thus we rely on a parameter for the alignment value. Provided a good value is used for the last case, the first restriction could possibly be relaxed.
References bit_position(), dump_file, dump_flags, gimple_assign_rhs1(), gimple_assign_single_p(), gimple_bb(), gimple_has_volatile_ops(), gimple_phi_arg_def(), gimple_phi_num_args(), gimple_phi_result(), gsi_end_p(), gsi_for_stmt(), gsi_move_to_bb_end(), gsi_next(), gsi_start_phis(), gsi_stmt(), host_integerp(), basic_block_def::index, local_mem_dependence(), operand_equal_p(), optab_handler(), print_gimple_stmt(), and virtual_operand_p().
Referenced by tree_ssa_phiopt_worker().
static bool jump_function_from_stmt ( )
Update *ARG which is defined in STMT so that it contains the computed value if that seems profitable. Return true if the statement is made dead by that rewriting.
References double_int::from_shwi(), get_addr_base_and_unit_offset(), gimple_assign_rhs1(), gimple_assign_rhs_code(), HOST_WIDE_INT, mem_ref_offset(), and offset.
Referenced by value_replacement().
static bool local_mem_dependence ( )
Return TRUE if STMT has a VUSE whose corresponding VDEF is in BB.
References gimple_vuse().
Referenced by hoist_adjacent_loads().
gimple_opt_pass * make_pass_cselim ( )
gimple_opt_pass * make_pass_phiopt ( )
static bool minmax_replacement ( )
The function minmax_replacement does the main work of doing the minmax replacement. Return true if the replacement is done, otherwise return false. BB is the basic block where the replacement is going to be done. ARG0 is argument 0 from the PHI. Likewise for ARG1.
References edge_def::dest, duplicate_ssa_name(), empty_block_p(), extract_true_false_edges_from_block(), gimple_assign_lhs(), gimple_assign_rhs1(), gimple_assign_rhs2(), gimple_assign_rhs_code(), gimple_build_assign_with_ops(), gimple_cond_code(), gimple_cond_lhs(), gimple_cond_rhs(), gsi_insert_before(), gsi_last_bb(), gsi_last_nondebug_bb(), gsi_move_before(), GSI_NEW_STMT, integer_nonzerop(), last_and_only_stmt(), last_stmt(), operand_equal_for_phi_arg_p(), replace_phi_edge_with_variable(), edge_def::src, and type().
Referenced by tree_ssa_phiopt_worker().
bool nonfreeing_call_p ( )
Return true when CALL is a call stmt that definitely doesn't free any memory or make it unavailable otherwise.
References BUILT_IN_NORMAL, gimple_call_builtin_p(), gimple_call_flags(), and gimple_call_fndecl().
static void nt_fini_block ( )
Called by walk_dominator_tree, when basic block BB is exited.
References basic_block_def::aux.
Referenced by get_non_trapping().
static void nt_init_block ( )
Called by walk_dominator_tree, when entering the block BB.
References add_or_mark_expr(), basic_block_def::aux, gimple_assign_lhs(), gimple_assign_rhs1(), gimple_assign_single_p(), gimple_has_volatile_ops(), gsi_end_p(), gsi_next(), gsi_start_bb(), gsi_stmt(), is_gimple_call(), nonfreeing_call_p(), nt_call_phase, basic_block_def::preds, and edge_def::src.
Referenced by get_non_trapping().
static void replace_phi_edge_with_variable ( )
Replace PHI node element whose edge is E in block BB with variable NEW. Remove the edge from COND_BLOCK which does not lead to BB (COND_BLOCK is known to have two edges, one of which must reach BB).
References delete_basic_block(), edge_def::dest_idx, dump_file, dump_flags, basic_block_def::flags, gimple_bb(), gsi_last_bb(), gsi_remove(), and basic_block_def::index.
Referenced by abs_replacement(), conditional_replacement(), minmax_replacement(), and value_replacement().
static gimple single_non_singleton_phi_for_edges ( )
Return the singleton PHI in the SEQ of PHIs for edges E0 and E1.
References edge_def::dest_idx, gimple_phi_arg_def(), gimple_seq_singleton_p(), gsi_end_p(), gsi_next(), gsi_stmt(), and operand_equal_for_phi_arg_p().
Referenced by tree_ssa_phiopt_worker(), and value_replacement().
static unsigned int tree_ssa_cs_elim ( )
This pass tries to transform conditional stores into unconditional ones, enabling further simplifications with the simpler then and else blocks. In particular it replaces this:

   bb0:
     if (cond) goto bb2; else goto bb1;
   bb1:
     *p = RHS;
   bb2:

with

   bb0:
     if (cond) goto bb1; else goto bb2;
   bb1:
     condtmp' = *p;
   bb2:
     condtmp = PHI <RHS, condtmp'>
     *p = condtmp;

This transformation can only be done under several constraints, documented below. It also replaces:

   bb0:
     if (cond) goto bb2; else goto bb1;
   bb1:
     *p = RHS1;
     goto bb3;
   bb2:
     *p = RHS2;
   bb3:

with

   bb0:
     if (cond) goto bb3; else goto bb1;
   bb1:
   bb3:
     condtmp = PHI <RHS1, RHS2>
     *p = condtmp;
References loop_optimizer_finalize(), loop_optimizer_init(), scev_finalize(), scev_initialize(), and tree_ssa_phiopt_worker().
static unsigned int tree_ssa_phiopt ( )
This pass tries to replace an if-then-else block with an assignment. We have four kinds of transformations. Some of these transformations are also performed by the ifcvt RTL optimizer.

Conditional Replacement
-----------------------

This transformation, implemented in conditional_replacement, replaces

   bb0:
    if (cond) goto bb2; else goto bb1;
   bb1:
   bb2:
    x = PHI <0 (bb1), 1 (bb0), ...>;

with

   bb0:
    x' = cond;
    goto bb2;
   bb2:
    x = PHI <x' (bb0), ...>;

We remove bb1 as it becomes unreachable. This occurs often due to gimplification of conditionals.

Value Replacement
-----------------

This transformation, implemented in value_replacement, replaces

   bb0:
    if (a != b) goto bb2; else goto bb1;
   bb1:
   bb2:
    x = PHI <a (bb1), b (bb0), ...>;

with

   bb0:
   bb2:
    x = PHI <b (bb0), ...>;

This opportunity can sometimes occur as a result of other optimizations.

ABS Replacement
---------------

This transformation, implemented in abs_replacement, replaces

   bb0:
    if (a >= 0) goto bb2; else goto bb1;
   bb1:
    x = -a;
   bb2:
    x = PHI <x (bb1), a (bb0), ...>;

with

   bb0:
    x' = ABS_EXPR< a >;
   bb2:
    x = PHI <x' (bb0), ...>;

MIN/MAX Replacement
-------------------

This transformation, implemented in minmax_replacement, replaces

   bb0:
    if (a <= b) goto bb2; else goto bb1;
   bb1:
   bb2:
    x = PHI <b (bb1), a (bb0), ...>;

with

   bb0:
    x' = MIN_EXPR (a, b)
   bb2:
    x = PHI <x' (bb0), ...>;

A similar transformation is done for MAX_EXPR.

This pass also performs a fifth transformation of a slightly different flavor.

Adjacent Load Hoisting
----------------------

This transformation replaces

   bb0:
    if (...) goto bb2; else goto bb1;
   bb1:
    x1 = (<expr>).field1;
    goto bb3;
   bb2:
    x2 = (<expr>).field2;
   bb3:
    # x = PHI <x1, x2>;

with

   bb0:
    x1 = (<expr>).field1;
    x2 = (<expr>).field2;
    if (...) goto bb2; else goto bb1;
   bb1:
    goto bb3;
   bb2:
   bb3:
    # x = PHI <x1, x2>;

The purpose of this transformation is to enable generation of conditional move instructions such as Intel CMOVE or PowerPC ISEL.
Because one of the loads is speculative, the transformation is restricted to very specific cases to avoid introducing a page fault. We are looking for the common idiom:

   if (...)
     x = y->left;
   else
     x = y->right;

where left and right are typically adjacent pointers in a tree structure.
References gate_hoist_loads(), and tree_ssa_phiopt_worker().
static unsigned int tree_ssa_phiopt_worker ( )
The core routine of conditional store replacement and normal phi optimizations. Both share much of the infrastructure in how to match applicable basic block patterns. DO_STORE_ELIM is true when we want to do conditional store replacement, false otherwise. DO_HOIST_LOADS is true when we want to hoist adjacent loads out of diamond control flow patterns, false otherwise.
References abs_replacement(), blocks_in_phiopt_order(), cond_if_else_store_replacement(), cond_store_replacement(), conditional_replacement(), edge_def::dest, edge_def::dest_idx, edge_def::flags, free(), get_non_trapping(), gimple_cond_lhs(), gimple_phi_arg_def(), gsi_commit_edge_inserts(), gsi_end_p(), gsi_next(), gsi_stmt(), hoist_adjacent_loads(), last_stmt(), minmax_replacement(), phi_nodes(), phis, pointer_set_destroy(), predictable_edge_p(), basic_block_def::preds, single_non_singleton_phi_for_edges(), single_pred(), single_pred_p(), single_succ_p(), basic_block_def::succs, and value_replacement().
Referenced by tree_ssa_cs_elim(), and tree_ssa_phiopt().
static int value_replacement ( )
The function value_replacement does the main work of doing the value replacement. Return non-zero if the replacement is done, otherwise return 0. If we remove the middle basic block, return 2. BB is the basic block where the replacement is going to be done. ARG0 is argument 0 from the PHI. Likewise for ARG1.
References edge_def::dest, edge_def::dest_idx, dump_file, dump_flags, extract_true_false_edges_from_block(), gimple_assign_lhs(), gimple_cond_code(), gimple_cond_lhs(), gimple_cond_rhs(), gimple_phi_result(), gsi_after_labels(), gsi_end_p(), gsi_next_nondebug(), gsi_stmt(), basic_block_def::index, is_gimple_assign(), is_gimple_debug(), jump_function_from_stmt(), last_stmt(), operand_equal_for_phi_arg_p(), phi_nodes(), print_generic_expr(), replace_phi_edge_with_variable(), single_non_singleton_phi_for_edges(), and single_succ_edge().
Referenced by tree_ssa_phiopt_worker().
static struct pointer_set_t * nontrap_set
The set of MEM_REFs which can't trap.
static unsigned int nt_call_phase
Used for quick clearing of the hash-table when we see calls. Hash entries with phase < nt_call_phase are invalid.
Referenced by add_or_mark_expr(), get_non_trapping(), and nt_init_block().
static hash_table< ssa_names_hasher > seen_ssa_names
The hash table for remembering what we've seen.