Ian Bogost has written a lovely little essay for The Atlantic, musing on the ends of technology and their impact on human dignity. His conclusion is fundamentally pessimistic: humanity is perhaps blindly and inexorably headed towards a state where people work for their machines rather than the other way around.
To briefly rehearse the argument, Bogost notes that technological innovations are often "precarious"; that is, they create new problems even as they address old ones. The automatic toilet, for example, spares people from having to touch public potties, yet often flushes at inappropriate moments. The solution: more technology, in the form of newer, "smarter" sensors.
The result is a kind of feedback loop, in which technology increasingly displaces people from the management of their own lives. On this view, technological progress will ultimately bring about a future where people cede their autonomy wholesale to their machines.
Bogost ties this sort of precariousness to the precarious labor that has become more widespread with the rise of Uber and the like. However, I think that the problem he has in mind has less to do with machines replacing human labor and more to do with them replacing human initiative.
I suspect that most people would agree that autonomy is a fundamental component of human dignity (though not the only one: good relations with other people are also important). Yet, for the argument to work, we would need evidence that potential avenues for the expression of autonomy are indeed fundamentally curtailed by the sorts of innovation that Bogost points to.
Is the space for human initiative really like an island, finite in scope, increasingly inundated by a rising tide of "intelligent" gadgets? Or is it more like the water, which conforms to whatever container it is in yet remains wet?
To be honest, I am unsure of the answer, but I can imagine, more easily than Bogost can, that human initiative is robust and will find other means of expression even when apps try to curate our lives for us.
However, I could be wrong. Better Google it!