Manipulation of hand-held objects in Virtual Reality (VR) requires input tracking with a high degree of freedom of movement, as well as haptic feedback of hand-object interactions. Through our prototypes, we demonstrate a pragmatic approach to haptic feedback on controllers that render human-scale forces. Our devices provide haptic simulation of compliance, texture, surface normals, size, weight, and kinematic forces. These properties are brought to bear on hand-object interaction primitives such as palpation, manipulation, grasping, squeezing, cutaneous touch, stable grip, dexterity, and precision manipulation. We collect these primitives into a taxonomy that represents a layer between the inherent haptic properties of objects and the operator's hand interactions. We implement prototypes that simulate the functional affordances of each of these aspects and characterize their performance in human perception studies. Our work offers a model of hand-object interactions that goes beyond force rendering on a finger-by-finger basis, as is typical of hand exoskeletons and gloves.