On the SIGCSE mailing list, a bunch of us crusty old professors reminisced about how we used to program with punched cards, how Grace Hopper would hand out lengths of wire to show how far electricity travels in a nanosecond, and so on, and how students would just shake their heads as we droned on. I was about to contribute that at my pre-college job I had to key in the boot loader for some piece of equipment in octal, but then I realized that all this, while good clean fun for those doing the reminiscing, is not fair to the students.
So I tossed in that my grad students don't have a concept of a "server room" because their projects run somewhere in the cloud, and that a growing number of undergraduates are baffled by the concept of a "laptop" because for them, computing happens on a phone or maybe a tablet.
And I got some heartfelt responses about how an Amazon data center isn't really any different from a server room, or how laptops will always be around. So it's not just the students who extrapolate from their own experience.
It's pretty amazing to be in a field where the most basic assumptions about how one works get fundamentally shaken up every twenty years or so. I don't know if I will still be teaching twenty years hence, but if I am, my students will probably be confused by the concepts of a "data center" and a "mobile device".