In response to your request for an interesting story on early computing, I have an anecdote.
I was a consultant at the Computer Center from 1981 to 1985. Around 1982, a seventh-grade class from a local school came for a tour, and I was asked to lead it. I took them into the basement to see the “computer,” an IBM mainframe, but I knew they would be bored by little more than red- and white-paneled boxes. The tape drives, spinning forward and backward, intrigued them a little. Then I shared with them the statistic that the computer could perform an operation in about the time it takes light to travel one foot. Having some idea of the speed of light, I had found that fact mind-blowing, but again I realized it would leave them unimpressed.
Finally, I thought I could show them a simple multiplication. We went back upstairs and sat around a terminal, and I started up APL (short for “A Programming Language”). I asked them for a large number. They all volunteered their biggest numbers, and I typed one in. Then I typed the multiplication sign, asked them for another big number, and typed that in too. Realizing that the program would return the product in scientific notation, I paused to explain that the computer would give the answer in a strange format with an E in it, but that it would be correct. I then hit the enter key, and the answer appeared on the screen immediately. “Wows” went all around, except from one boy, who exclaimed, “Uh-uh. You cheated. The computer had time to think about it while it was on the screen!” At a loss for how to explain myself out of that, I accepted defeat.
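For readers who have never seen that “strange format with an E in it,” here is a small sketch. I no longer have an APL system at hand, so this uses Python rather than APL, and the two numbers are made up for illustration, but the idea is the same: the product of two big numbers can be displayed in scientific (E) notation.

```python
# Two made-up "big numbers" like the ones the students might have called out.
a = 987654
b = 123456

# Plain multiplication gives the exact product...
print(a * b)           # 121931812224

# ...but formatted in scientific notation, it shows up with an E in it,
# much as APL displayed large results.
print(f"{a * b:E}")    # 1.219318E+11
```

The E+11 simply means “times ten to the eleventh power,” which is all I was trying to warn the students about before hitting enter.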