i think cars have a long way to go in terms of input. the command line is to the steering tiller as the mouse is to the steering wheel. not to say that mice are dead, but it's no longer the primary way we interface with technology

what is the trackpad of cars? what's the touch screen? what's the wii remote? what's the kinect? what's the eye tracking?

i don't think these things are immediately compatible, as computing and driving have entirely different utilities, haptics, and emotions attached to them, but i seriously wonder how we can be more intimate (?) with mobility, especially in an age where ICE (internal combustion) is seen as peak intimacy.. ICE does not define mobility, so how can we make EVs more intimate? does the feeling of intimacy with ICE stem from the systems of a car being a simulacrum of our own biology? i feel that makes the most sense. in that case..

does a person with a pacemaker feel human? shouldn’t they? one day, when we inevitably make a more efficient heart replacement, should they not be afforded the same humanity? parallels..

why do robotics companies so often attempt a humanoid form factor? humans are decidedly not built to be the most efficient at the majority of tasks we would want robots to do, yet we want something like us..

i wonder how much of this is nature vs nurture.. i think we're only truly being faced with these questions now because, honestly, iPad babies are a thing and it's interesting to say the least.. how does the human brain react to being afforded tools like LLMs from the start of perception?

as the world gets more technologically binary, we (designers) need to account more for the human analog.. haptics, etc etc etc