If you're familiar with SQL joins it tends to make more sense. Basically, when you set up a report in there you have to be very careful about which objects you include as columns, because the report system is just stringing joins together however you tell it to, regardless of whether that join is a good idea. Most likely the columns you picked are crossing two data sets, and that's what produces the unexpected results.

I'd say the best way to learn the tool, and to find where the mistakes are being made, is to start from an OOTB report and confirm the data looks believable. Then add columns one at a time, running the report after each change and watching for where the results get weird. It took me a few tries when I first used the tool to wrap my head around it, but eventually you learn that you can't mix certain sets of data in a single report. It usually works out better to build many tightly focused reports instead of trying to string together a bunch of things that don't live on the same DB tables.
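To make the "crossing the data sets" problem concrete, here's a minimal sketch in Python using sqlite3. The table and column names (`incidents`, `assets`, `user_id`) are made up for illustration and don't come from any particular reporting tool; the point is just that joining two one-to-many sets through a shared key multiplies rows, which is exactly the kind of inflated count a report builder will happily produce.

```python
import sqlite3

# Two hypothetical one-to-many sets hanging off the same user:
# one user with 2 incidents and 3 assets.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE incidents (user_id INTEGER, incident TEXT);
CREATE TABLE assets    (user_id INTEGER, asset TEXT);
INSERT INTO incidents VALUES (1, 'INC001'), (1, 'INC002');
INSERT INTO assets    VALUES (1, 'Laptop'), (1, 'Phone'), (1, 'Monitor');
""")

# Joining the two sets on the shared user fans out:
# 2 incidents x 3 assets = 6 rows. The SQL is "correct",
# but a report on this result looks like 6 incidents.
rows = cur.execute("""
    SELECT i.incident, a.asset
    FROM incidents i
    JOIN assets a ON a.user_id = i.user_id
""").fetchall()
print(len(rows))  # 6 rows from 2 incidents
```

This is why adding one column at a time works as a debugging strategy: the moment a column drags in an unrelated one-to-many table, the row count jumps and you know which join broke the report.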