You and I care because we are human, with genes that direct us to propagate those genes. Anything beyond reproduction is merely an accessory to its success; all intellectual, social, and moral goals are means to that end.
Yet I have never directly tried to maximise my reproductive fitness using my intelligence, and on many occasions I have even knowingly worked against it; anything I do may well be linked to those reproductive goals, but it may well contradict them too. Actually, the only goal on which my nature and I agree wholeheartedly is self-preservation.
I've worked with computers for nearly half of my life (programming, playing with some AI, etc.)
We're speaking about AGI, and of the trans- or posthuman sort. That is not the same thing; I may well have worked with bacteria for days and weeks ad nauseam, and I understand them quite a bit, but it didn't give me much direct insight into the workings of human psychology.
Your questions also challenge the utopian predictions about human desires for future AI.
Yes, they do as well.
Also, I don't say it isn't possible, or even likely. As a matter of fact, if you plan to live beyond a few centuries, and especially if you plan to go posthuman, I think in most cases you'll run into that issue you're speaking of, seeing how arbitrary your goals are. Heck, I'm already running into it now. The answer? Maybe it is that most of what it means to be human can only exist at a certain level of intelligence and consciousness, that we can't be stable at another level, and that we'd need other motivational systems, other goals, or at the very least ways to protect our goals and not fall into madness or existential nihilism.
Why should it continue to exist if it has no reason to? It is not like life, which is motivated by the greatest need: reproduction, and survival for the sake of reproduction.
Well, the single most intelligent thing I've heard in the Matrix trilogy:
"Because I choose to"
Or, if you prefer (because I think that statement will be misunderstood), and if you've read Egan: remember the character who decided to rewire his brain into loving the act of making chairs and wooden furniture for centuries on end (that was in Permutation City)? There's also a similar case in Diaspora, where a character decides to rewire and freeze his mind state into some sort of enlightened Buddhist, and who can't be swayed by any argument anymore, because he simply can't ever change his system of beliefs and goals.
That's what I mean by "do not care": you care because caring is a part of your nature, as much a result of evolution as those other goals contingent on reproduction. What if, for a start, we have a mind whose goals are different, and which is wired to protect those goals? Its protective measures could for instance be "I don't care about that" or "lalala, I can't and won't hear any of that" whenever something arises that could threaten the meaning of its goals (but they could certainly be a lot of other clever, or dumb, things).