February 2006 Archives

Insanely... Disappointing


I've got to say, today's big product announcements from Apple are something of a let-down.

Great Big Hype... Itty-Bitty Desk Space

The big news is obviously the Intel Mac Mini. It's better in some respects -- a faster processor never hurts, and the new version of Front Row, with the ability to stream video from other Macs on the network, seems to adequately equip the Mini to reside in the living room, as the last outpost of your so-called iLife.

Of course, the dream starts to fall apart pretty quickly; Apple's remote is proprietary and undocumented, so you're not going to be controlling your Mini with a universal remote. It's also not open to developers (not that that will stop anybody), which means we're unlikely to see the fine folks at El Gato joining the party any time soon.

I'm also concerned with the video output quality; for folks with HDMI or DVI connectors on their HDTVs life may be sweet, but those of us still living in S-Video land have to use an adaptor, which historically results in a picture that looks almost but not quite entirely like crap.

Intel Killed the Radio Star

Aside from the jump from a PPC to an Intel Core chip, the new Minis also made another Intel jump -- they drop the ATI chip that had provided video to the previous generation, and move to one of Intel's integrated graphics chips. I'm up in the air about this one; on the one hand, the modern Intel graphics chips don't really deserve their terrible reputations -- they can scroll text as prettily as anything else you'd find. And obviously Apple's gotten Core Image and Core Video to work with them (we know this from the developer test machines, which ran an Intel integrated video chip as well).

The real problem is gaming, and this is where I'm torn. It's unlikely you'll have a good time playing WoW on the new Minis, but on the other hand most people who buy a Mini have no interest in WoW. Most folks who want to game on their Mac are students, who tend to hit the iBooks (or MacBooks, as they'll doubtless be called) and iMacs, or random geeks like myself who will gravitate towards the MacBook Pros and whatever replaces the Power Mac line. And those are all great machines for gaming. The target audience for the Mini -- at least as far as I can tell -- are the folks who'll spend most of their time in Mail, Safari, iPhoto, and iTunes.

The deeper problem is probably one of expectations; the fact that the new generation can't do (or can't do as well) something the previous generation did adequately will rankle some, and the traditional Mac-user-bitching-echo-chamber will ensure that everyone hears about it. It's one of those situations where either choice was wrong, and I guess I can't fault Apple for staying on-message.

iPod, uPod, wePod... No, from now on only iPod

The other big news -- where by 'big' I mean 'small' and by 'news' I mean 'grab for cash' -- is Apple's foray into the purportedly billion-dollar iPod-accessory industry. On the one hand, it's only fair that they grab a slice of the pie that they baked. On the other hand, these new products aren't really much to write home about.

Hi, Fi!

The iPod Hi-Fi has got to be the most God-awful ugly product Apple has ever produced. And remember, kids, I used to have an Apple Portrait Display! They can talk all they want about how great the sound quality is, but that's not going to sell to most folks -- I can't hear a web page, and most stores are too noisy to even hear the salesperson (thank God). And the true audiophile set -- the kind who prefer vinyl because it's 'warmer' and like vacuum-tube speakers because they're 'costlier' -- aren't going to be caught dead with an iPod. They have far too much ego invested in believing they're better than we compressed-music-listening plebeian cretins.


A leather case for my iPod? Wow, neat!

A hundred bucks? Hey, look, iPod socks!

.NET From a Mac Perspective


I'm currently taking a course in .NET programming. It has been -- and continues to be -- an interesting experience.

I've been a Mac guy for as long as I've been using computers. I can use Windows, and in fact I get paid to do so on occasion, but for my own purposes I avoid Windows like the plague. Nonetheless, it's nothing dogmatic with me -- unlike some Mac users, I don't feel like we need to wage a holy war against Redmond. The world has far too many jihads ongoing already.

Despite my familiarity with Windows, my programming experience has been uniformly Mac- and *nix-based. I'm familiar enough with MFC to recognize when a program is the result of a severely lazy developer overusing it, but not enough to actually create a Win32 binary.

First Impressions

First off, Visual Studio is a very nice IDE. Microsoft's code completion -- 'Intellisense' -- is very well done. If I'm working with strings, Visual Studio can show me the various manipulation methods about as quickly as Xcode can show me the selectors for an NSString -- despite the fact that Visual Studio is running in emulation. Apple really needs to get their ass in gear on that count.

The Windows Forms Designer, to put things simply, sucks. It's like somebody looked at HyperCard but never got to actually use it, and then went off to clone it. If you want to change the text in a label, you need to view its properties and type into a tiny little box -- if you just double-click expecting to edit the text, it brings up your source file and lets you write a procedure to be invoked... when the user clicks on your label.

That's right: when the user clicks on your label. The reason that so many Windows apps have crummy UIs is suddenly becoming clearer to me. It's a pretty basic fact of consistent UI design that labels don't actually do anything; the fact that the default path in Microsoft's standard development tool makes them into buttons is, to say the least, alarming. Making things work the wrong way should never be the default behavior -- in fact, doing something as nonstandard as this should be bloody hard work that only the most dedicated hackers could pull off.

Also interesting with Windows Forms is the inability to decouple control from view (a la MVC). Each form (which in the real world we call a 'window') is an object, and each UI element is configured to execute a method of this object as it's manipulated. I'm not saying this is wrong, per se, but it takes some getting used to.

Speaking of things that take some getting used to, these form objects are also in charge of creating the windows and UI elements they control. Unlike the Mac, where windows are generated by system calls based on data files (either resources with the Toolbox or Nibs in Carbon or Cocoa), the Windows Form Designer just generates code and inserts it into your source file.
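To make that concrete, here's a rough sketch of the sort of code the Form Designer emits (the names are hypothetical, and the real generated code is considerably longer and noisier than this):

```csharp
// Hypothetical sketch of Designer-generated code. In VS 2005 this actually
// lands in a partial class in a separate Form1.Designer.cs file, but it's
// still plain C# sitting in your project, which you can (unwisely) hand-edit.
public partial class MainForm : System.Windows.Forms.Form
{
    private System.Windows.Forms.Label statusLabel;

    public MainForm()
    {
        InitializeComponent();
    }

    private void InitializeComponent()
    {
        // The designer constructs and configures every control in code...
        this.statusLabel = new System.Windows.Forms.Label();
        this.statusLabel.Location = new System.Drawing.Point(12, 9);
        this.statusLabel.Text = "Hello";
        // ...and wires UI events straight to methods on the form object,
        // which is why double-clicking a label hands you a Click handler.
        this.statusLabel.Click += new System.EventHandler(this.statusLabel_Click);
        this.Controls.Add(this.statusLabel);
    }

    private void statusLabel_Click(object sender, System.EventArgs e)
    {
        // Labels shouldn't do anything, but here we are.
    }
}
```

Compare that to a Nib, where the window and its controls live in a data file and are instantiated by the system at runtime -- there's simply no generated code to get out of sync with.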

I can see benefits both ways. The biggest problem with the Windows method is that it isn't hard to get the two out of sync to the point where things become difficult, so you're better off just leaving the generated code alone -- which raises the question, why is the code generated at all? Why doesn't the form editor just directly produce the MSIL and hand it to the compiler to insert into your binary?

More to come. Stay tuned.

Visual Studio and Virtual PC


I'm currently taking a course in .NET programming. It's an interesting experience in a number of ways, and I really ought to stop being lazy and write about them.

Until then, if any other Mac user out there in Google-land is using Visual Studio .NET 2005 in Windows 2000 under Virtual PC 6.1, the trick to stop the random crashes is to disable the 'vshost' debugging introduced in VS.NET 2005. Just bring up the project properties, go to the 'Debug' pane, and uncheck 'Enable the Visual Studio hosting process'.

VSHost sounds quite a lot like the 'fix and continue' feature Apple added to Xcode a few versions back. It seems to work just about as well.

Not A Full Deck


Hillary thinks that Republicans are playing the 'fear card'.

She's almost right. Republicans are playing the terror card. As in, "Hey, look at those terrorists! Aren't you glad we have an army?".

I get that Democrats don't want to be scared. Being scared is no fun -- in fact it's the exact opposite of fun. It's supposed to be uncomfortable -- it's a complex evolved behavior that creates mental discomfort and motivates us to avoid or alter the situation that's scaring us. There's a point to fear -- and that point is to get us off our asses and paying attention.

Sez Miz Clinton: "You cannot explain to me why we have not captured or killed the tallest man in Afghanistan".

Actually, I can: because he's simply not a priority. Osama bin Laden is not magic. He cannot fly, he cannot fire laser beams out of his eyes, and his ability to breathe fire is suspect as well. Capturing Osama bin Laden would have no greater tactical benefit than capturing any other high-ranking Al Qaeda member.

Al Qaeda is not a corporation, with the CEO making the tough calls. It isn't an army, with the General deciding on the battle plan. Al Qaeda is a distributed network -- yeah, just like those distributed networks you use to pirate MP3s. And the thing about distributed networks is that cutting off their head really doesn't do anything.

Hillary's comment shows her to be of the same mind as Kerry (read the last two paragraphs of this post). She sees the war on terror as revenge for 9/11, so clearly capturing Osama is the priority. Or perhaps she sees the war on terrorism as a way to placate the bloodlust of us idiot Christian rednecks, and that's why capturing Osama is a priority. It's really hard to say what Hillary Clinton believes, because her opinion changes every time a new poll comes in.

Doors of Perception, Halls of Medicine


I got smashed on antibiotics and wrote this.

This is similar enough to make me chuckle.

Begun, the Cartoon Wars Have



The question we all should be asking ourselves is, are these people Ramen or Varelse?

Safari Can Be Kinda Dumb


So I updated the server to 10.4 today (plus every last security update, so don't even bother). Things went about as well as could be expected; I had to reconfigure Apache and Postfix, I had to keep MySQL and PHP talking, a few things didn't 'stick' until I toggled them off and on a few times (including, bewilderingly, sshd).

The most surprising part of the whole experience wasn't anything on the server at all, it was Safari on my soon-to-be-replaced PowerBook.

Immediately after getting things partially back up, I pulled out my laptop and checked various things out, including Movable Type. Because the various document roots weren't configured yet, Safari was understandably unable to load the images, CSS files, and JavaScript files used in the MT user interface. That's pretty easy to forgive, seeing as the fault was with the server.

The strangeness came after the server was properly reconfigured: when I logged in to MT on my laptop, everything still looked like garbage. When I brought up the activity window (cmd-opt-a), 90% of it wasn't loaded! And yet when I opened a new browser window and typed in the URL to that resource, it loaded just fine. If I hit the reload button enough, eventually a single page of the MT UI would start to look okay -- except the next page would be back to looking like crap, unable to find the same blasted CSS file I had just managed to convince the previous page to load.

I'm still not really sure where the problem is. It could be that Safari is checking the revision dates from headers and deciding that the 404 error pages it saw last time are still current, it could be that Safari's cache is just stupid, it could be that Safari's cache on this particular computer is corrupted...

Very odd. I finally broke down and emptied the cache, and now all is right with the world -- except for the fact that everything loads slowly and will continue to do so for a few weeks at least.


Skirwan's Scrapbook


As long as I'm doing all this web revamping, I figured I might as well put some of my old Clan Lord images up somewhere. Check it out.

I'm Going To Call It PowerBook Anyway


I just placed my order for a MacBook Pro.

I don't expect to see it before early March, so I doubt I'll get the chance to claim the nearly $10K bounty on Windows XP compatibility, which is sad.

Listening to some of the clamor about this, I'm forced to the conclusion that a lot of people are just very, very dumb. For instance, this post by some guy nobody's ever heard of is being hailed like it's the second coming or something... but it's really kinda silly.

Like, for instance, step nine -- "Use the Bootable Vista DVD to boot on the MacBook". The far more clever folks at Ars Technica already tried that. You can't boot from a DVD that the computer won't boot from.

That's just the most glaring. Step 16 is the sort of thing I'd expect to find in a business plan written by underpants gnomes -- It's like writing "and then a miracle occurs" when you're detailing a chemical reaction. Step eight is just as dumb as step nine, unless you assume he means to do that on a separate PC... but if you're doing all this on a separate PC you might as well just boot from another drive and do this all booted in Windows or some flavor of Linux.

The first step to figuring all this out would be getting an open source bootloader (probably grub) working. Worrying about faking MBRs and chaining unperfected bootloaders from beta operating systems is putting the cart before the horse. The (Intel) man pages for hdiutil, bless, and nvram would be helpful, and I'm rather dismayed they haven't shown up on the web yet.

Ah well. I'd rather have something like QEMU anyway -- dual-booting is such a pain.

Master of My (Own) Domain


Welcome to SeanKerwin.org.

I feel so special.