Imagine that some politician said "Cars are an important part of our modern economy. Many jobs involve cars. Therefore, all students must learn how to build cars; we will be adding it to the curriculum." This seems (to me, anyway) pretty ridiculous on its own. Now imagine that the students are being taught to make small car models out of card, while being told that this is how cars are actually built (this is where the analogy falls apart slightly: with cars, the average person can tell the difference; with programming, most people can't). This is what the push to "learn to code" in schools seems like to me (this was written in 2017, when the UK government was making a big thing of it, but it still seems to be going on now, in 2020). In fact, it's even worse, with hyperbolic claims of it being "the new literacy", often made by people who have never done anything beyond basic block-dragging in Scratch or some equivalent.
The average person is definitely going to interact with lots of things programmed by someone else, given the increasing popularity of mobile phones. This does not, however, mean that they must know every detail of how these things work (not that anyone can, at this point - they're just too complex), and the "learn to code" initiatives now run in schools wouldn't actually teach them this anyway. Most "learn to code" resources, especially those used in schools, start with very simple, visual, 2D-graphical environments. This is fine for learning a few basic things (though not very good - Scratch's unusual programming environment maps poorly onto actual widely-used languages), but most of the time nothing beyond that seems to be taught - usually it's considered "too hard" for the students involved, or there just isn't anyone qualified to teach it. And that basic dragging around of blocks is not hugely useful: it doesn't teach much (maybe basic concepts of flow control), and you may have to unlearn things when moving to actual programming.
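For what it's worth, the one concept that does transfer reasonably well - basic flow control - looks something like this in a conventional textual language (Python here, as an arbitrary example; the sprite is replaced with a plain counter, since there are no graphics):

```python
# A Scratch script like "repeat 10 / change score by 1" corresponds
# roughly to an ordinary loop in a textual language.
score = 0
for _ in range(10):  # Scratch's "repeat 10" block
    score += 1       # Scratch's "change score by 1" block
print(score)  # 10
```

The loop is about where the resemblance ends, though; things like Scratch's broadcast messages and per-sprite scripts don't have such a direct equivalent.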
I have an alternative list of things to teach which I think might actually be relevant and helpful to people in a way that making a cat dance on screen by blindly following a tutorial is not:
- an introduction to computer hardware (for troubleshooting, etc) and what all the myriad cables do
- basics of networking (what routers do, ISPs and their job, one of those layered network models, HTTP(S), DNS)
- privacy in the digital age (i.e. maybe stop giving Facebook/Google/Amazon all your private information)
- operating systems, what various programs are for, and the fact that ones which aren't Windows exist
- what programming actually involves
- basic shell-or-equivalent scripting (though this may not actually be very useful either, as the OSes people mostly interact with now - iOS, Windows, Android, etc. - disallow this sort of thing or don't make it very useful, sadly)
- fixing basic problems using advanced IT techniques such as "using a search engine to look up your issue" and "blindly tweaking settings until it does something"
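To give a rough idea of what the "basic shell-or-equivalent scripting" item means in practice, here is a minimal sketch (POSIX shell; the files and the backup task are made up purely for illustration):

```shell
#!/bin/sh
# Back up every .txt file in the current directory by copying it
# with a .bak extension - the sort of small, practical automation
# that basic shell scripting makes possible.
for f in *.txt; do
    [ -e "$f" ] || continue  # skip if no .txt files matched
    cp "$f" "$f.bak"
done
```

Nothing sophisticated, but it is the kind of task that actually comes up, unlike making a sprite dance.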
This doesn't really sound as fancy as teaching "the new literacy", but it seems like a better place to start for helping people be able to interact with modern computer systems.
After showing someone this post, I was told that Scratch is more about teaching some level of computational-thinking-type skills - learning how to express intentions in a structured way and being precise/specific - than actually teaching programming, regardless of how it's marketed. This does seem pretty sensible, actually. I can agree that it is probably useful for this, since most people will enjoy making visual things with direct feedback more than writing a bunch of code to print "Hello, World!" or sort a list or something. Still, it definitely has limits even for this, given that it's quite lacking in control flow capability and abstraction compared to regular programming languages. Also, it's not really marketed this way, and thus probably not taught that way either.