For centuries, the term "computer" referred to a profession.

In the 1930s and 1940s, computers were people, working through complex mathematics by hand.

In the 1800s, they calculated problems in mechanics and steam expansion.

In the 1400s and 1500s, they worked out problems in geometry and astronomy.

These advancements may have put the mythologists on edge, but they expanded our collective capabilities, and the reorganization of how tasks were allocated among people soon followed.

Throughout history, when new technologies took over menial tasks, we moved on to new, greater things. Technology extends our abilities and alleviates the meniality of our work.

When television was emerging, people knew it would change the way things were done, and while some claimed it would be the end of movie theatres, in hindsight, it wasn't. The same is true for radio: its place may have changed, from the centre of the living room to perhaps the car, but it's still around today. These technological shifts often reorganize our way of doing things, but rarely, if ever, do they completely obsolesce everything that came before them.

So where does that put us today?

Not really in a new position. The emergence of powerful language models may be novel, but the outcome is the same: less meniality, more impact. This is just another chapter in the story of our machines, in this case computers, advancing our capabilities.

Computation, and now cognition, are the driving forces behind the growing capabilities of our digital machinery. While there is a tremendous amount of novelty here, I don't think it's something to fear in itself.