GCC Middle and Back End API Reference
store-motion.c File Reference

Data Structures

struct  st_expr
struct  st_expr_hasher


static struct st_expr * st_expr_entry ()
static void free_st_expr_entry ()
static void free_store_motion_mems ()
static int enumerate_store_motion_mems ()
static struct st_expr * first_st_expr ()
static struct st_expr * next_st_expr ()
static void print_store_motion_mems ()
static bool store_ops_ok ()
static int extract_mentioned_regs_1 ()
static rtx extract_mentioned_regs ()
static bool load_kills_store ()
static bool find_loads ()
static bool store_killed_in_pat ()
static bool store_killed_in_insn ()
static bool store_killed_after (const_rtx x, const_rtx x_regs, const_rtx insn, const_basic_block bb, int *regs_set_after, rtx *fail_insn)
static bool store_killed_before (const_rtx x, const_rtx x_regs, const_rtx insn, const_basic_block bb, int *regs_set_before)
static void find_moveable_store ()
static int compute_store_table ()
static void insert_insn_start_basic_block ()
static int insert_store ()
static void remove_reachable_equiv_notes ()
static void replace_store_insn ()
static void delete_store ()
static void build_store_vectors ()
static void free_store_memory ()
static int one_store_motion_pass ()
static bool gate_rtl_store_motion ()
static unsigned int execute_rtl_store_motion ()
rtl_opt_pass * make_pass_rtl_store_motion ()


static struct st_expr * store_motion_mems = NULL
static sbitmap * st_kill
static sbitmap * st_avloc
static sbitmap * st_antloc
static sbitmap * st_transp
static sbitmap * st_insert_map
static sbitmap * st_delete_map
static int num_stores
static struct edge_list * edge_list
static hash_table< st_expr_hasher > store_motion_mems_table

Function Documentation

static void build_store_vectors ( )
   Fill in available, anticipatable, transparent and kill vectors in
   STORE_DATA, based on lists of available and anticipatable stores.  
     Build the gen_vector. This is any store in the table which is not killed
     by aliasing later in its block.  
             If we've already seen an available expression in this block,
             we can delete this one (It occurs earlier in the block). We'll
             copy the SRC expression to an unused register in case there
             are any side effects.  
                 It should not be necessary to consider the expression
                 killed if it is both anticipatable and available.  
static int compute_store_table ( )
   Find available and anticipatable stores.  
     Find all the stores we care about.  
         First compute the registers set in this block.  
         Now find the stores.  
             Now that we've marked regs, look for stores.  
             Unmark regs that are no longer set.  
         last_set_in should now be all-zero.  
         Clear temporary marks.  
     Remove the stores that are not available anywhere, as there will
     be no opportunity to optimize them.  

References st_expr::antic_stores.

static void delete_store ( )
   Delete a store, but copy the value that would have been stored into
   the reaching_reg for later storing.  
             We know there is only one since we deleted redundant
             ones during the available computation.  
static int enumerate_store_motion_mems ( )
   Assign each element of the list of mems a monotonically increasing value.  
static unsigned int execute_rtl_store_motion ( )
static rtx extract_mentioned_regs ( )
   Returns a list of registers mentioned in X.
   FIXME: A regset would be prettier and less expensive.  

Referenced by store_killed_before().

static int extract_mentioned_regs_1 ( )
   Helper for extract_mentioned_regs.  

Referenced by store_ops_ok().

static bool find_loads ( )
   Go through the entire rtx X, looking for any loads which might alias
   STORE_PATTERN.  Return true if found.
   AFTER is true if we are checking the case when STORE_PATTERN occurs
   after the insn X.  
     Recursively process the insn.  

References exp_equiv_p(), output_dependence(), and SET.

static void find_moveable_store ( )
   Determine whether INSN is a MEM store pattern that we will consider moving.
   REGS_SET_BEFORE is bitmap of registers set before (and including) the
   current insn, REGS_SET_AFTER is bitmap of registers set after (and
   including) the insn in this basic block.  We must be passing through BB from
   head to end, as we are using this fact to speed things up.

   The results are stored this way:

   -- the first anticipatable expression is added into ANTIC_STORES
   -- if the processed expression is not anticipatable, NULL_RTX is added
      there instead, so that we can use it as indicator that no further
      expression of this type may be anticipatable
   -- if the expression is available, it is added as head of AVAIL_STORES;
      consequently, all of them but this head are dead and may be deleted.
   -- if the expression is not available, the insn because of which it fails
      to be available is stored in REACHING_REG (via LAST_AVAIL_CHECK_FAILURE).

   Things are complicated a bit by the fact that there may already be stores
   to the same MEM from other blocks; the caller must also take care of the
   necessary cleanup of the temporary markers after the end of the basic block.
     If we are handling exceptions, we must be careful with memory references
     that may trap.  If we are not, the behavior is undefined, so we may just
     continue.
     Even if the destination cannot trap, the source may.  In this case we'd
     need to handle updating the REG_EH_REGION note.  
     Make sure that the SET_SRC of this store insn can be assigned to
     a register, or we will fail later on in replace_store_insn, which
     assumes that we can do this.  But sometimes the target machine has
     oddities like MEM read-modify-write instruction.  See for example
     Do not check for anticipatability if we either found one anticipatable
     store already, or tested for one and found out that it was killed.  
     It is not necessary to check whether store is available if we did
     it successfully before; if we failed before, do not bother to check
     until we reach the insn that caused us to fail.  
         Check that we have already reached the insn at which the check
         failed last time.  

References st_expr::antic_stores.

static struct st_expr* first_st_expr ( )
   Return first item in the list.  
static void free_st_expr_entry ( )
   Free up an individual st_expr entry.  
static void free_store_memory ( )
   Free memory used by store motion.  
static void free_store_motion_mems ( )
   Free up all memory associated with the st_expr list.  
static bool gate_rtl_store_motion ( )
static void insert_insn_start_basic_block ( )
   Insert an instruction at the beginning of a basic block, and update
   the BB_HEAD if needed.
   (In all code following after this, REACHING_REG has its original
   meaning again.  Avoid confusion, and undef the accessor macro for
   the temporary marks usage in compute_store_table.)  
     Insert at start of successor block.  

References bitmap_clear_bit(), edge_def::dest, st_expr::index, basic_block_def::preds, and edge_def::src.

static int insert_store ( )
   This routine will insert a store on an edge. EXPR is the st_expr entry for
   the memory reference, and E is the edge to insert it on.  Returns nonzero
   if an edge insertion was performed.  
     We did all the deletes before this insert, so if we didn't delete a
     store, then we haven't set the reaching reg yet either.  
     If we are inserting this expression on ALL predecessor edges of a BB,
     insert it at the start of the BB, and reset the insert bits on the other
     edges so we don't try to insert it on the other edges.  
     If tmp is NULL, we found an insertion on every edge, blank the
     insertion vector for these edges, and insert at the start of the BB.  
     We can't put stores in the front of blocks pointed to by abnormal
     edges since that may put a store where one didn't use to be.  

References st_expr::antic_stores, bitmap_bit_p(), bitmap_clear(), bitmap_set_bit(), edge_def::dest, dump_file, ei_container(), ei_edge(), ei_end_p(), ei_next(), exp_equiv_p(), find_reg_equal_equiv_note(), free(), st_expr::index, basic_block_def::index, last, st_expr::pattern, remove_note(), sbitmap_alloc(), sbitmap_free(), stack, basic_block_def::succs, and visited.

static bool load_kills_store ( )
   Check to see if the load X is aliased with STORE_PATTERN.
   AFTER is true if we are checking the case when STORE_PATTERN occurs
   after the X.  
rtl_opt_pass* make_pass_rtl_store_motion ( )
static struct st_expr* next_st_expr ( )
   Return the next item in the list after the specified one.  
static int one_store_motion_pass ( )
   Perform store motion. Much like gcse, except we move expressions the
   other way by looking at the flowgraph in reverse.
   Return non-zero if transformations are performed by the pass.  
     Find all the available and anticipatable stores.  
     Now compute kill & transp vectors.  
     Now we want to insert the new stores which are going to be needed.  
         If any of the edges we have above are abnormal, we can't move this
         store.  
static void print_store_motion_mems ( )
   Dump debugging info about the store_motion_mems list.  
static void remove_reachable_equiv_notes ( )
   Remove any REG_EQUAL or REG_EQUIV notes containing a reference to the
   memory location in SMEXPR set in basic block BB.

   This could be rather expensive.  
static void replace_store_insn ( )
   This routine will replace a store with a SET to a specified register.  
     Move the notes from the deleted insn to its replacement.  
     Emit the insn AFTER all the notes are transferred.
     This is cheaper since we avoid df rescanning for the note change.  
     Now we must handle REG_EQUAL notes whose contents are equal to the mem;
     they are no longer accurate provided that they are reached by this
     definition, so drop them.  
static struct st_expr* st_expr_entry ( )
   This will search the st_expr list for a matching expression.  If no
   match is found, a new entry is created and initialized.  

Referenced by store_killed_before().

static bool store_killed_after ( const_rtx  x,
const_rtx  x_regs,
const_rtx  insn,
const_basic_block  bb,
int *  regs_set_after,
rtx fail_insn 
   Returns true if the expression X is loaded or clobbered on or after INSN
   within basic block BB.  REGS_SET_AFTER is a bitmap of registers set in
   or after the insn.  X_REGS is the list of registers mentioned in X.  If
   the store is killed, return in FAIL_INSN the last insn in which it occurs.  
         We do not know where it will happen.  
     Scan from the end, so that fail_insn is determined correctly.  
static bool store_killed_before ( const_rtx  x,
const_rtx  x_regs,
const_rtx  insn,
const_basic_block  bb,
int *  regs_set_before 
   Returns true if the expression X is loaded or clobbered on or before INSN
   within basic block BB.  X_REGS is the list of registers mentioned in X.
   REGS_SET_BEFORE is a bitmap of registers set before or in this insn.  

References st_expr::antic_stores, can_assign_to_reg_without_clobbers_p(), function::can_throw_non_call_exceptions, cfun, extract_mentioned_regs(), find_reg_note(), may_trap_p(), st_expr::pattern_regs, side_effects_p(), and st_expr_entry().

static bool store_killed_in_insn ( )
   Check if INSN kills the store pattern X (is aliased with it).
   AFTER is true if we are checking the case when store X occurs
   after the insn.  Return true if it does.  
         A normal or pure call might read from pattern,
         but a const call will not.  
          But even a const call reads its parameters.  Check whether the
          base of some of the registers used in mem is the stack pointer.  
     If this insn has a REG_EQUAL or REG_EQUIV note referencing a memory
     location aliased with X, then this insn kills X.  
     However, if the note represents a must alias rather than a may
     alias relationship, then it does not kill X.  
     See if there are any aliased loads in the note.  
static bool store_killed_in_pat ( )
   Go through pattern PAT looking for any loads which might kill the
   store in X.  Return true if found.
   AFTER is true if we are checking the case when the loads that kill X
   occur after the insn for PAT.  
         Check for memory stores to aliased objects.  

References may_be_sp_based_p(), and SET.

static bool store_ops_ok ( )
   Return false if some of the registers in the list X are killed due to
   the set of registers in bitmap REGS_SET.  

References extract_mentioned_regs_1(), and for_each_rtx().

Variable Documentation

struct edge_list* edge_list
   Contains the edge_list returned by pre_edge_lcm.  

Referenced by compute_insert_delete(), and pre_insert_copies().

int num_stores
   Global holding the number of store expressions we are dealing with.  
sbitmap * st_antloc
sbitmap * st_avloc
sbitmap * st_delete_map
   Nonzero for expressions which should be deleted in a specific block.  
sbitmap * st_insert_map
   Nonzero for expressions which should be inserted on a specific edge.  
sbitmap * st_kill
   These bitmaps will hold the local dataflow properties per basic block.  
sbitmap * st_transp
struct st_expr* store_motion_mems = NULL
   Head of the list of load/store memory refs.  
hash_table<st_expr_hasher> store_motion_mems_table
   Hashtable for the load/store memory refs.