It's difficult to give a simple definition of the concept of entropy. The concept comes from the second law of thermodynamics, and it is described by a physical equation that only physicists can understand. The second law seems to contradict the first, but in fact they are complementary. The first law can be summarized simply: "The change in the internal energy of a closed thermodynamic system is equal to the sum of the heat supplied to the system and the work done on it." The second law says, in short, that in an isolated system disorder tends to increase: it's easier to lose usable energy than to recover it.

Robert Smithson explains this idea with an example: a child running clockwise in a sandbox divided between white and black sand mixes the two. If the child then decides to run the other way, the sand does not return to its original arrangement but becomes more and more mixed. Smithson says the only way to return the sand to its first state would be to show a film of the event backwards. But the film itself is subject to entropy, so after some time it could no longer be viewed.

The digital world is subject to entropy too, though it is harder to notice. Every time an image is re-encoded with lossy compression, some of its information is lost, and after many such copies it becomes unreadable. With this video I try to show both this degradation and Smithson's ideas about entropy.
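The generation loss described above can be sketched with a toy model (not any real codec): treat the image as a ring of pixel values and make each "copy" by averaging neighbouring pixels with integer rounding. Each pass discards detail that no later pass can restore, and the values drift toward a uniform grey, much like Smithson's mixed sand.

```python
import random

def regenerate(img):
    """One lossy 'copy': average each pixel with its neighbour,
    rounding down. The rounding discards information that no
    later generation can recover."""
    n = len(img)
    return [(img[i] + img[(i + 1) % n]) // 2 for i in range(n)]

random.seed(0)
original = [random.randrange(256) for _ in range(64)]  # a toy 1-D "image"

copy = original[:]
for generation in range(50):
    copy = regenerate(copy)

# After many generations the copy flattens toward a single grey value:
# fewer distinct values, a smaller range -- the mixing is irreversible.
print(len(set(original)), len(set(copy)))
print(max(original) - min(original), max(copy) - min(copy))
```

Running the loop longer only makes the copy flatter; running it "backwards" is impossible, because the rounded-away detail is gone, just as the sand never unmixes.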