The word “tech” has become ubiquitous.
It’s a descriptor: high tech, tech geeks, tech jobs—all of these refer to things that revolve around microchips and the software that runs on them, with lasers and DNA occasionally thrown in to round things out.
Some educators have instituted “tech breaks” for their students, not to get students away from their electronic devices for a few minutes, but so they can use them.
Tech also refers to the materials themselves: all the various things with screens and connectivity get lumped into a vast category, tech, which, when we’re done with it, leads us back to a descriptor: tech waste. (And we still don’t have a solution for that.)
Richard Parry, writing in the Stanford Encyclopedia of Philosophy, traces the origins of “tech” to the ancient Greek term technê, which can be contrasted with the term epistêmê. Technê encompassed art, craft, and practical application, while epistêmê generally included knowledge of fixed forms, of nature (which was assumed not to change), of virtue and other abstractions, and of mathematics.
We can see this more recently in distinctions between applied and theoretical sciences, between so-called “blue sky” research and the development of new products and designs.
But with the rise of market-based everything, this distinction is starting to collapse; if it’s unlikely to lead to a new product, it’s unlikely to be funded.
That’s a great loss. As tech flourishes, reason declines: we retain the what but forget the why.