Retinal prostheses are used to restore vision to individuals whose vision impairments are caused by damaged photoreceptors in the retina. Despite early successes, designing prostheses that restore functional vision remains challenging due to the large number of design parameters that must be customized for individual users. Gathering data from real patients in a timely and safe manner is also difficult. To address these problems, a virtual environment for realistically and safely simulating prosthetic vision is described. Besides supporting phosphenized rendering of images at different resolutions for normally sighted users, together with eye-movement tracking, the environment also supports the spatial distortions commonly perceived by prosthesis users. A procedure to automatically generate such spatial distortions is developed. User corrections, if any, are logged and compared with the original distortion values to evaluate distortion perception. Experimental results obtained using this environment to perform various visual acuity tasks are described.
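For readers unfamiliar with phosphenized rendering, the following is a minimal sketch of one common simulated-prosthetic-vision model: the input image is downsampled to an electrode-like grid, and each sample is drawn as a Gaussian phosphene. The function name, grid size, and parameters here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def phosphenize(image, grid=16, out_size=128, sigma_frac=0.35):
    """Render a grayscale image (values in [0, 1]) as a grid x grid
    array of Gaussian phosphenes. All parameters are illustrative."""
    h, w = image.shape
    # Average brightness inside each grid cell -> phosphene intensity.
    ys = np.linspace(0, h, grid + 1).astype(int)
    xs = np.linspace(0, w, grid + 1).astype(int)
    levels = np.array([[image[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                        for j in range(grid)] for i in range(grid)])
    # Draw each phosphene as a Gaussian dot on an output canvas.
    out = np.zeros((out_size, out_size))
    cell = out_size / grid
    sigma = sigma_frac * cell
    yy, xx = np.mgrid[0:out_size, 0:out_size]
    for i in range(grid):
        for j in range(grid):
            cy, cx = (i + 0.5) * cell, (j + 0.5) * cell
            blob = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2)
                          / (2 * sigma ** 2))
            out += levels[i, j] * blob
    return np.clip(out, 0.0, 1.0)
```

Lowering `grid` coarsens the percept, which is one way such an environment can emulate different implant resolutions for sighted test users.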