I have a much clearer understanding of Cocoa's coordinate system after these two days' work.
Firstly, the drawRect function of NSView doesn't simply draw on top of the context: when sibling views overlap, the z-order is undefined, so things that were drawn first are sometimes not covered by things drawn later. The reason, after googling it, is that NSView is not designed for overlapping siblings; views are meant to nest as subviews of each other to form a view hierarchy.
So, in my case of a map editor, each component should not be regarded as a subview of the main view representing the map. Instead, they should be drawn directly in the drawRect function of the main view.
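In Swift, a minimal sketch of that idea might look like the following (MapView and its components array are hypothetical names of mine, not AppKit API):

```swift
import AppKit

// Hypothetical MapView: the map components are plain data, not subviews.
// They are painted back-to-front inside draw(_:), so the array order, not
// an undefined sibling z-order, decides what covers what.
final class MapView: NSView {
    var components: [(rect: NSRect, color: NSColor)] = []

    override func draw(_ dirtyRect: NSRect) {
        for component in components {
            component.color.setFill()
            component.rect.fill()   // later components paint over earlier ones
        }
    }
}
```

Because the draw order is just the array order, bringing a component to the front is a matter of moving it to the end of the array and calling needsDisplay.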
Secondly, here is the sample code for rotating an image (not 100% complete, as I haven't yet handled the case where the angle is 0, but that should be pretty easy :p).
The concept would be better described with an illustration, but forgive my laziness.
Brief steps for rotating an image about its center:
1. create an NSBezierPath based on the original image's bounds
2. transform the NSBezierPath to get the bounds after the image is rotated
3. create an NSImage with its size equal to the size of the rotated bounds
4. center the image in the rotated bounds
5. set up the NSAffineTransform: move to the center of the bounds > rotate > move back to the original position
6. lock focus on the newly created image and draw the original image into its bounds. done =]
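The steps above can be sketched in Swift. This is a minimal sketch, not my exact editor code: rotatedImage(_:byDegrees:) is a helper name I made up, not an AppKit API, and unlike the description above it already handles the angle-0 case.

```swift
import AppKit

// A minimal sketch of the six steps. `rotatedImage(_:byDegrees:)` is a
// hypothetical helper, not an AppKit API.
func rotatedImage(_ image: NSImage, byDegrees angle: CGFloat) -> NSImage {
    // The case the steps leave out: no rotation needed.
    if angle == 0 { return image }

    // Steps 1-2: build a path from the original bounds and transform it
    // to get the bounds of the rotated image.
    let path = NSBezierPath(rect: NSRect(origin: .zero, size: image.size))
    path.transform(using: AffineTransform(rotationByDegrees: angle))
    let rotatedBounds = path.bounds

    // Step 3: a new image the size of the rotated bounds.
    let result = NSImage(size: rotatedBounds.size)

    // Step 4: center the original image inside the rotated bounds.
    let origin = NSPoint(
        x: (rotatedBounds.width - image.size.width) / 2,
        y: (rotatedBounds.height - image.size.height) / 2)

    // Step 5: move to the center of the bounds > rotate > move back.
    let transform = NSAffineTransform()
    transform.translateX(by: rotatedBounds.width / 2, yBy: rotatedBounds.height / 2)
    transform.rotate(byDegrees: angle)
    transform.translateX(by: -rotatedBounds.width / 2, yBy: -rotatedBounds.height / 2)

    // Step 6: lock focus on the new image and draw the original into it.
    result.lockFocus()
    transform.concat()
    image.draw(at: origin, from: .zero, operation: .sourceOver, fraction: 1.0)
    result.unlockFocus()
    return result
}
```

Rotating a 100×50 image by 90 degrees, for example, should yield a 50×100 result.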
In the drawRect function, calibrate the rect of the image; otherwise the image will not be "anchored" at its center.
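Concretely, the calibration is just re-centering: the rotated image is bigger than the original, so its draw rect has to be shifted so that both share the same center. A small Foundation-only sketch (calibratedRect is a hypothetical helper name):

```swift
import Foundation

// Given the rect the unrotated image occupied and the size of the rotated
// image, return a draw rect with the same center, so the image stays
// anchored at its center as it rotates.
func calibratedRect(original: CGRect, rotatedSize: CGSize) -> CGRect {
    let center = CGPoint(x: original.midX, y: original.midY)
    return CGRect(
        x: center.x - rotatedSize.width / 2,
        y: center.y - rotatedSize.height / 2,
        width: rotatedSize.width,
        height: rotatedSize.height)
}
```

In drawRect, the rotated image is then drawn into the calibrated rect instead of the original one.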
I don't know whether there is a better method (something like simply putting the image in an NSImageView and rotating the view), but this at least works for me.