Simply put, your eyes see light. That is, the only information your eyes receive about the world in order to construct an image in your head is the photons that have bounced off of objects near you and into your eyes. So constructing a photo-realistic image is (almost) all about simulating lighting. When there is no light, there are no photons to interact with your eyes, and so obviously, you don't see anything.
In everyday life, there are light sources all around us, from lamps in our houses or in the streets, to the sun during the day and the moon (or perhaps just the stars) at night. Every light source has characteristics that we are familiar with; for example, the sun is much brighter than most other light sources we experience, has light of a particular color (white), occupies a different position in the sky at different times of day, and so forth.
A light source emits photons (and our eyes receive them) at an astronomical rate; the number is so huge that it is difficult to relate it to numbers in our everyday experience. Each photon a light source emits will travel in a straight line forever, until it interacts with some matter (as a matter of fact, the photons that enter your eyes when you look up at the stars at night are the same photons that left the stars years, or even millennia, earlier and have traveled across space in that time to enter your eye). When a photon does interact with matter, several things might happen.
In the simplest case, a photon strikes matter and reflects off. In this case, the photon will typically carry information about the color of the object it struck and continue on in the reflected direction. The specifics of the color depend on the circumstance. For example, if white light reflects off a brown table, the photons that reach your eye will carry information about the brown color of the table, not the white light. If red light reflects off of a white table, it will probably carry some information about the original color of the light and reach your eye as a shade of red.
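A common (and simplified) way to model the color examples above is to treat both the light's color and the surface's reflectance as RGB triples in [0, 1] and multiply them component-wise; the function name and the particular RGB values below are just illustrative assumptions, not measured data.

```python
def reflect_color(light_rgb, surface_rgb):
    """Color carried by light after bouncing off a surface (simple RGB model)."""
    return tuple(l * s for l, s in zip(light_rgb, surface_rgb))

white_light = (1.0, 1.0, 1.0)
brown_table = (0.6, 0.4, 0.2)  # an arbitrary brownish reflectance
red_light   = (1.0, 0.0, 0.0)
white_table = (1.0, 1.0, 1.0)

# White light off a brown table: you see the table's brown.
print(reflect_color(white_light, brown_table))  # (0.6, 0.4, 0.2)
# Red light off a white table: you see the light's red.
print(reflect_color(red_light, white_table))    # (1.0, 0.0, 0.0)
```

This component-wise product is the same trick most renderers use when combining light color with surface color, though real materials are of course far more complicated.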
However, the photon does not always reflect off. What a photon does when it interacts with matter depends on a great many optical properties of the matter. For example, surfaces can have varying degrees of reflectiveness, specularity, opacity, and surface smoothness (amongst others). Photons that strike very reflective surfaces are much more likely to reflect off in a predictable direction (the direction you would expect from looking at a mirror at an angle), but if they strike a non-reflective surface, they're much more likely to scatter in various directions, or simply be absorbed, so that the photon does not continue on any further from the surface. Similarly, a photon that strikes a particularly transparent surface might not bounce off, but instead continue onwards, potentially having been diverted to a slightly different course (this change in original heading is called refraction, and it is manifested, for example, by the distortion you see when you stick a pen into a glass of water).
You might look at the room you are sitting in, with all its finely detailed surfaces, and many intricate objects, and numerous light sources, and imagine all of the photons zipping around the room, bouncing off of objects, through objects, or getting absorbed, or perhaps bouncing off of objects numerous times, all before finally reaching your eyes. The final picture your brain processes is constructed from a huge number of photons, all of which have taken a potentially complex path in their journey from the light source to your eyes.
The problem becomes apparent. We cannot, even with futuristic computers many years hence, hope to model all of these light interactions as individual photons leaving light sources and zipping around the scene before heading into our virtual camera lens. Instead, we must find more computationally tractable ways of approximating the situation described above. The most obvious such technique is called raytracing.
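The key trick that makes this tractable is to trace rays backwards: instead of following photons out of the light source, we follow one ray from the camera through each pixel and ask what it hits. Here is a minimal sketch of that idea under made-up scene values (a single sphere, an ASCII "image"); everything here, from the camera setup to the scene, is an illustrative assumption.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic discriminant (a = 1 for a unit direction)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(width=24, height=12):
    """Fire one ray per pixel from a pinhole camera at the origin, looking down -z."""
    sphere_center, sphere_radius = (0.0, 0.0, -4.0), 1.0
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to a point on an imaginary image plane.
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            d = (x, y, -1.5)
            length = math.sqrt(sum(c * c for c in d))
            d = tuple(c / length for c in d)
            row += "#" if ray_sphere((0, 0, 0), d, sphere_center, sphere_radius) else "."
        rows.append(row)
    return "\n".join(rows)

print(render())  # a crude circle of '#' characters: the sphere, as seen by the camera
```

Even this toy version contains the essence of raytracing: a ray per pixel, an intersection test against the scene, and a color decision at the hit point. Everything that follows builds on that loop.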