The resolution of a display is the total number of pixels it contains, usually given as width × height.
The actual hardware density is constant, but the density that is presented to the software can be changed.
Suppose a program needs to draw something at a particular physical height. If the reported display density is 160 ppi, the program will use 16 pixels for it. If the reported density is instead 200 ppi, that tells the program the pixels are smaller, so it will use 20 pixels to present the same object. However, since the actual resolution (and pixel size) is unchanged, the object will in fact look bigger on screen.
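To make that concrete, here's a minimal sketch of the arithmetic a program might do. The 0.1-inch target height and the function name are illustrative, not from any real toolkit; the point is that the pixel count scales with the *reported* density, not the hardware's:

```python
def size_in_pixels(physical_inches: float, reported_ppi: int) -> int:
    """Pixels needed so an object should span `physical_inches`
    at the density the system reports to software."""
    return round(physical_inches * reported_ppi)

# An object meant to look 0.1 inch tall:
print(size_in_pixels(0.1, 160))  # 16 pixels at a reported 160 ppi
print(size_in_pixels(0.1, 200))  # 20 pixels at a reported 200 ppi

# If the hardware is really 160 ppi in both cases, the 20-pixel
# version renders physically larger: 20 / 160 = 0.125 inches.
```

This is exactly why bumping the reported density up is a common way to make everything on screen appear larger.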
I hope that clears it up for you.
When there are more Linux-based devices being hooked up to monitors and projectors than there are Windows PCs, can we say that it's the year of Linux on the desktop?