1. In a Data Flow,
if an error occurs while reading from the source (OLE DB or other),
for example, a conversion to date fails at some row/column,
some rows may already have been committed to the OLE DB destination before the error happened.
The Rows Per Batch field is usually left empty or set to -1, which in practice results in batches of 1000 rows,
so the batches committed before the row that threw the error are already in the destination table.
They must be deleted manually,
or by an Execute SQL Task placed after the Data Flow and connected with an on-failure precedence constraint.
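One way to sketch that cleanup step: an Execute SQL Task, wired to the Data Flow with a Failure precedence constraint, that deletes the partially committed rows. This assumes the load stamps each row with a batch/run identifier; the table name `dbo.StagingTable`, the column `LoadBatchId`, and the mapped package variable are all hypothetical placeholders, not names from the package above.

```sql
-- Hypothetical schema: adjust table and column names to your own load.
-- Runs only when the Data Flow fails (Failure precedence constraint),
-- removing the rows that were committed before the error.
DELETE FROM dbo.StagingTable
WHERE LoadBatchId = ?;  -- OLE DB parameter mapped to a package variable holding the current run id
```

If the destination is truncated and fully reloaded on every run, a plain `TRUNCATE TABLE` (or a `DELETE` without the filter) in the same on-failure task is simpler and avoids tracking a batch id.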
2. If a component inside a container fails, the container itself does not fail!
Which means components placed after the container will still run.
For example, a staging Data Flow inside a container failed because of errors in the data,
yet the file-archive component after the container still ran and archived the file even though staging had failed.
To avoid that, set the failing component's FailParentOnFailure and FailPackageOnFailure properties to True. By default both are False.