Simplest way to put it:
1080P - Looks like you are right there filming the movie
720P - Does not
That's subjective, though. The same thing could have been said about 480p versus 720p. And when 4K (or possibly UHDTV, whichever wins the battle of being the next big standard) comes out, the same can be said about that versus 1080p.
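To put numbers on the jump between steps, here's a quick sketch comparing total pixel counts for the common 16:9 resolutions (the 854-wide figure for 480p is the usual widescreen approximation):

```python
# Rough pixel counts for common 16:9 resolutions (width, height)
resolutions = {
    "480p": (854, 480),     # widescreen approximation
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")
```

1080p carries 2.25x the pixels of 720p, and 4K is another 4x jump on top of 1080p, which is why each step looks like a similar leap.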
Also, the type of video compression plays a huge role. For professionally released Blu-rays this usually isn't a problem. But if someone is releasing a fansub or the like, they may choose lower-quality compression that encodes faster so they can release sooner, which results in a picture that doesn't look as good.
Keep in mind that distance from the display plays a big role. At 32" or smaller, a 720p TV and an identically sized 1080p TV will show effectively little difference, because you sit far enough back that your eye can't resolve such small detail. With computer monitors the difference is easier to see because you're only a couple of feet away.
And, in case you were wondering, the 'p' stands for progressive, while 'i' stands for interlaced. This describes how the signal is sent. Interlaced (generally regarded as the inferior method) splits each frame into two fields, the even- and odd-numbered horizontal lines, which can leave visible horizontal line artifacts, especially during motion. Progressive sends each frame as a whole, so there are no such lines and the picture is better. This matters mostly for TVs; computer monitors are driven progressively anyway.
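The even/odd split above can be sketched in a few lines; here a "frame" is just a list of numbered scanlines, which is a toy model rather than a real video signal:

```python
# Toy model of interlaced vs progressive transmission of one frame
frame = [f"line {n}" for n in range(6)]

# Progressive: all scanlines of the frame are sent in order
progressive = frame

# Interlaced: the frame is split into two fields sent one after the other
even_field = frame[0::2]  # lines 0, 2, 4
odd_field = frame[1::2]   # lines 1, 3, 5

print("progressive:   ", progressive)
print("field 1 (even):", even_field)
print("field 2 (odd): ", odd_field)
```

Since each field holds only half the lines, interlacing halves the bandwidth per transmission, which is why broadcast TV used it; the cost is the combing artifacts when the two fields capture different moments of motion.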