Body posture is one of many visual cues sighted persons use to determine whether someone would be open to initiating a conversation. These cues are inaccessible to individuals with blindness, leading to difficulties when deciding whom to approach for potential assistance. Current camera technologies, such as depth cameras, make it possible to automatically scan the environment and assess the approachability of nearby persons. We present Whom-I-Approach, a system that translates the postures of bystanders into a measure of approachability and communicates this information through auditory and tactile cues. The system scans the environment and determines, from body posture, the approachability of persons in the user's vicinity. A user study measuring efficiency, perceived system usability, and psychosocial attitudes shows the system's potential to improve the perceived competence of users with blindness before they engage in social interactions.