
Sick, Mars

I went home sick yesterday with something stomach-related that felt like the flu. I don't know. I'm here today, and I feel a little better. I spent the afternoon in bed, reading Red Mars and/or taking a nap. It's a very cool book, and it covers a lot of details about Mars colonization that people wouldn't think about: money, religion, different countries competing against each other, that sort of thing.

Things I'm wondering about include what kind of power outlets they use, whether the TVs are PAL or NTSC, whether the women take birth control or use something else during these frequent plot-building extramarital affairs, and so on. It also seems odd that they've been there about twenty years (at page 360-something) and can build just about anything. Assembly lines turn out complete rover cars and biosphere domes, yet there are only 10,000 people on the planet. I guess it's the robots they have, and maybe it's just easier to build things in lower gravity. Who knows.

I’m eating toast for lunch. It was in the fridge all morning, so it’s pretty awful.

This morning, I started thinking about what would have happened if, back in 1989, I'd hit myself in the head with a brick, started studying 40 hours a week, and switched to CompSci right away. I don't know why I torture myself like this, because I'm sure when I turn 40, I'll be wishing I'd had a moment of clarity in 1997 and started saving every damn penny I make.

So I'm thinking about this, and then I start thinking about computers and artificial intelligence. Would a computer know its limits? In a sense, even the most basic computer knows its limits: if you divide by zero or enter a number that's too big, it stops and gives you an error. Why can't humans know their limits like that? Is it because our greater ability to think gets in the way? I mean, even our bodies know our limits. If you drink 20 shots of rum back to back, your body will make you puke. But your mind won't make you stop after shot 2 or 12 or whatever. It might just be a bad analogy: with a computer, it's a simple matter of checking a circuit, whereas with a human, it's a more complex and fuzzy process of making value judgements.
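
If I had to put that in code terms, it would look something like this little Python sketch. None of it comes from the book, and the names (try_it and so on) are just things I made up; it's only the divide-by-zero and too-big-a-number cases spelled out:

    # A computer "knowing its limits": instead of plowing ahead, it stops
    # and reports an error when the operation doesn't make sense.
    def try_it(description, operation):
        try:
            print(f"{description}: {operation()}")
        except (ZeroDivisionError, OverflowError) as error:
            print(f"{description}: nope ({error})")

    try_it("divide by zero", lambda: 1 / 0)          # ZeroDivisionError
    try_it("number too big", lambda: 10.0 ** 10000)  # OverflowError

The human equivalent would be something that counts the shots and refuses to pour the next one, which is exactly the part we don't seem to have.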

I need to read more about AI before I go off on these weird tangents… And I need to go finish my applesauce…