One way of looking at the last forty years is as an enormous fuss made over digital printing.
Digital printing starts with a cubic polynomial, whose roots we need to find. Don't ask me how that's done. I'm still keeping a close eye on Mr Albert Einstein in case he tries to Eigenvector his way out of the only Linear Space that he is permitted in this home.
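For the curious, here is a minimal sketch of the root-finding I claim not to understand. The polynomial is invented for illustration; a real rasterizer solves one like it to find where a scanline crosses a curve.

```python
# A hypothetical cubic: t^3 - 6t^2 + 11t - 6, chosen because its
# roots are exactly 1, 2 and 3. Real font code solves similar ones.
import numpy as np

coeffs = [1.0, -6.0, 11.0, -6.0]
roots = np.roots(coeffs)

# Keep only the (numerically) real roots, as a rasterizer would.
real_roots = sorted(r.real for r in roots if abs(r.imag) < 1e-9)
print(real_roots)  # → approximately [1.0, 2.0, 3.0]
```

Cardano's formula would do it in closed form; in practice a numeric solver like this is what gets used.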
If I need to make sure that something doesn't go missing in my home, such as something which reminds me of what happens when someone reminds me of someone else, I have to assign it a Linear Space. I'm not very good at this task generally, because I spent too many years embroiled in debates over systems of classification.
And if one has things in boxes then one could write a puzzle-solving algorithm (which is one of the things that falls under the term AI) to stow them away optimally.
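Such a stowing algorithm might look like the classic first-fit-decreasing heuristic for bin packing, sketched below. The item sizes and the capacity are made up for illustration.

```python
# First-fit decreasing: sort items largest-first, then place each
# one in the first box with room for it, opening a new box if none.
def first_fit_decreasing(sizes, capacity):
    bins = []  # each bin is a list of item sizes
    for size in sorted(sizes, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])
    return bins

boxes = first_fit_decreasing([7, 5, 4, 3, 2, 2], capacity=10)
print(len(boxes))  # → 3, which happens to be optimal here
```

It is a heuristic, not a guarantee: truly optimal packing is NP-hard, which is why "optimally" is always said with a straight face.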
We're always looking for ways to make our skills useful, but if I couldn't do something as simple as taking a glyph from a typeface definition (which is where the cubic polynomials come in) and turning it into a raster (which, as we know, is how all visible output must be encoded), I'd have to stick to networking. But even there I would, if I were responsible, always be ready to produce an easy-to-understand diagram for anyone who might want to make changes in future.
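That glyph-to-raster step can be sketched in toy form: sample a cubic Bézier (the curve type used in glyph outlines) densely and mark the grid cells it passes through. The control points are invented, and real rasterizers use scanline fills rather than point plotting, but the idea is the same.

```python
# Evaluate a cubic Bézier at parameter t from four control points.
def bezier(p0, p1, p2, p3, t):
    u = 1.0 - t
    x = u**3*p0[0] + 3*u*u*t*p1[0] + 3*u*t*t*p2[0] + t**3*p3[0]
    y = u**3*p0[1] + 3*u*u*t*p1[1] + 3*u*t*t*p2[1] + t**3*p3[1]
    return x, y

W = H = 16
grid = [[' '] * W for _ in range(H)]
pts = [bezier((0, 0), (4, 15), (12, 15), (15, 0), i / 200)
       for i in range(201)]
for x, y in pts:
    grid[min(H - 1, int(y))][min(W - 1, int(x))] = '#'

for row in reversed(grid):  # print with y increasing upward
    print(''.join(row))
```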
Which puts me back at square one, because everyone knows that whatever we draw on a piece of paper, we can also print.
It must be understood that digital printing has its roots in PostScript, which is a reverse-Polish language (which gives everyone a headache), and which was designed without much thought for memory requirements (few scripting languages give them any). Also, not all shapes are cubic. A circle can be approximated with a handful of cubic segments closely enough to look circular to the naked eye; some shapes, however, have unwieldy dimensions.
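The "circular to the naked eye" claim can be checked. A quarter of a unit circle is conventionally approximated by a single cubic Bézier using the constant k = 4/3·(√2 − 1); the sketch below measures how far that curve strays from the true radius.

```python
# One quarter of a unit circle as a cubic Bézier, using the
# standard tangent-length constant k ≈ 0.5523.
import math

k = 4.0 / 3.0 * (math.sqrt(2) - 1)
p0, p1, p2, p3 = (1, 0), (1, k), (k, 1), (0, 1)

def bezier(t):
    u = 1.0 - t
    x = u**3*p0[0] + 3*u*u*t*p1[0] + 3*u*t*t*p2[0] + t**3*p3[0]
    y = u**3*p0[1] + 3*u*u*t*p1[1] + 3*u*t*t*p2[1] + t**3*p3[1]
    return x, y

# Worst deviation from the true radius of 1 over the quarter arc.
max_err = max(abs(math.hypot(*bezier(i / 1000)) - 1.0)
              for i in range(1001))
print(max_err)  # stays below 0.0003 of the radius
```

Four such segments make a full circle whose error no eagle-eyed child could spot at any printable size.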
With my love of Maclaurin Series definitions I can little tolerate a function being chopped up just for an effect.
The lack of consideration for memory requirements gave years of headaches to technical departments, who would be shown an error page and asked how to fix it. How often a person would find themselves looking at a full page of seemingly random text, or just the word "error", depended on what printer you had, what app you were using, and any other factor that the designers of PostScript could be depended on not to have taken into consideration.
That is, until the cost of memory came down sufficiently to allow desktops and printers to hold a full page at six hundred dots per inch in their memory: no less than a hundred and twenty megabytes. Three hundred dpi was not good enough, as parents are always proud of their eagle-eyed children who can see aliasing at that lower resolution.
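The hundred and twenty megabytes is easy to reproduce, assuming a US-letter page and a full 32-bit colour framebuffer (both the page size and the bit depth are my assumptions; the essay only states the total).

```python
# A US-letter page is 8.5 by 11 inches; at 600 dpi with 4 bytes
# per pixel, the framebuffer comfortably exceeds 120 MB.
width_px = int(8.5 * 600)   # 5100 pixels
height_px = 11 * 600        # 6600 pixels
bytes_per_pixel = 4         # 32-bit colour
total = width_px * height_px * bytes_per_pixel
print(total / 1e6)  # → about 134.6 megabytes
```

A plain black-and-white page at one bit per pixel would need only about four megabytes, which is why monochrome laser printers managed long before colour did.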
This only happened at the turn of the century, when we started hearing about printers just as effective as the older methods of printing; and it was then that we had to stop talking about desktop publishing.
But digital printing, using that older term, had been established in the nineteen-eighties, when home computers had at most a single megabyte (print shops used computer-cut plates and Gutenberg-style type arranged by machine).
Publishing businesses took to digital printing because it meant they would need fewer employees.