Steve Jobs: Keeping Apple Small Time?
Author: Michael Ahlf       Date: January 17th, 2002

There's no denying that Steve Jobs is one of the big names in the computer industry, a man who somehow manages to get investors to rally behind him.  He endorses the Mac as the be-all and end-all of home computing, continues to "innovate" its design (we'll cover this later), pushes it as the greatest computer system of all time, and, as a side note, was blown away by the Segway/It/Ginger, a product that, depending on where you stand, is either the greatest thing since the flush toilet or a psychotic razor scooter on crack.

We can look at the evolving Mac design with the same critical eye: analyze its features and ask what users will actually go for.  In the final analysis, it boils down rather simply.  Kind of like the Segway. You see, that little "beauty" of a device has one fundamental flaw. It looks pretty. It rides pretty. It has a sexy space-age appearance. It also gets you all of 10 to 15 miles, in approximately 1 hour, after which it has to sit plugged into a wall for 4-5 hours to recharge.  You can't take it to the movies or go grocery shopping unless you're already within walking distance. Likewise, the Mac stands as a testament to how NOT to get into the larger computer market.

Now, this wasn't always the case. Originally, Apple did quite well getting into the computer market.  It brought computing to the masses, invested heavily in education and edutainment software (Oregon Trail, anyone?), and managed to convince a large segment of children that Apple computers were great. A small percentage of those children retain that idea today, making Apple's following one of the most loyal around. Unfortunately, Apple made a few fatal mistakes.

Mistake #1: Cloning is good, not bad.

Apple, unlike the other computer makers in the world, has NOT traditionally built its computers with a modular design.  Want to add new parts?  It's tricky.  Adding RAM or PCI cards isn't a big deal; replacing the CD-ROM or the hard drive, on the other hand, is messy.  These machines just weren't designed to be user-serviced.

Apple also fumbled its strategy when, after resisting it for years, it allowed companies like Motorola to make Mac clones.  One of the reasons Windows took over, like it or not, is that it was never restricted to a single vendor's hardware.  Compaq, Dell, Gateway, and even the ill-fated Packard Bell all offered it on different machines, and any user could assemble an Intel/AMD box and install Windows themselves. When Apple finally turned around and let companies like Umax, Motorola, and Power Computing build Mac machines, it ran into problems.

The worst problem? The performance of the machines.  They ran, but often not well, and the terms under which these companies could license the Mac OS were extremely restrictive. Most of the machines were locked to a single OS version (the one they shipped with), and anything beyond that was "at your own risk." Unfortunately, the idea of affordably priced Mac clones coming back to market these days is just so much vaporware: after seeing what happens when Steve Jobs changes his mind (remember the debacle when ATi spoke too soon about providing hardware for a Mac line?), I don't think anybody is going to take the risk of having him pull the plug on an entire line of computers.

Mistake #2: Tying the Mac OS too tightly to hardware.

Mac OS is irrevocably tied to Mac hardware. Upgraded Mac OS editions have traditionally dropped support for older Macintosh processors, obsoleting previous machines as quickly as possible.  In the Wintel market this is partially true, but not to nearly the same extent.  Want to install Win98?  You can go back a long way, all the way to a 486 at 66 MHz with 16 MB of RAM.  It might be slow, but it will still run and it's still supported. And you can get the hardware from anywhere, install the system yourself, pay a shop to do it, or have a local guru or friend do it. The newest Windows, WinXP, still only requires a 233 MHz machine with 64 MB of RAM.  Try to install Mac OS X on older hardware and Apple won't give you much help at all.

Microsoft makes a PowerPC version of Windows, at least in the NT family, but Mac OS offers no similar option for x86 machines.  The consequence is easy to see: machine makers like IBM are free to build systems with PowerPC parts and load them with Windows. A niche market? Certainly.  But it still takes a bite out of Mac OS's market share.

The other downside of tying Mac OS so tightly to the hardware is that it discourages third-party companies from building hardware for the Mac.  While a uniform platform is nice for programmers, the length of time it took 3D accelerators to reach the Mac is one of the reasons Mac gamers have had to do without so many of the hits that make the PC so successful.

Mistake #3: Cute and cuddly does not a computer make.

One of Apple's biggest mistakes is the underlying assumption that a sexy computer design translates into sales. In some industries, like the automobile industry, the look of the product is what matters. Computers are different.

In the computer industry, there seems to be a fascination with the ugly.  Then again, most computers don't get seen very often. They sit under desks or underneath monitors.  They're the domain of the geek in the family, who may constantly open them up to tweak things, or they're busy being used for writing, homework, and games.  In any case, nobody sits and adores the case; they're more concerned with what's on the screen.

Likewise, "innovation" is a loosely defined term.  In the loosest sense, it's merely a change. It's supposed to be a change that makes something better, easier to use, or more likely to sell.  By that standard, Apple's hockey-puck mouse is supposed to be an innovation. It's not.  The addition of a second button to the mouse, and the utility it enabled (the right-click), is a true innovation, because it actually makes the computer easier to use.

So what has Apple goofed on in this sense? Let's see. So far there's the refusal to add another button to the mouse (how hard could it be, guys?).  There's the G4 "Cube," that little thing that could easily be mistaken for a wastebasket.  Dare I say it, there's the iPod, which picks up more or less where Creative Labs' Nomad player left off. Then again, the iPod is a double mistake for anyone foolish enough to buy it, considering that a jog-proof MP3 CD player can be had for as little as $130 compared to the iPod's $400 price tag. The storage size doesn't make that much difference either, since an MP3 CD can handle up to 10 hours of music, and nobody drives/sits/jogs/runs/whatever else for that long without a break.
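
For the skeptics, that 10-hour figure holds up to some back-of-the-envelope math. Here's a quick sketch; the 700 MB disc capacity and 128 kbps bitrate are my own assumptions, not numbers from any spec sheet:

```python
# Rough sanity check of MP3-CD playback time (assumed figures, not official specs).
CD_CAPACITY_MB = 700   # typical CD-R capacity -- an assumption on my part
BITRATE_KBPS = 128     # common MP3 bitrate -- also an assumption

capacity_bits = CD_CAPACITY_MB * 1000 * 1000 * 8   # megabytes -> bits
seconds = capacity_bits / (BITRATE_KBPS * 1000)    # bits / (bits per second)
hours = seconds / 3600

print(f"~{hours:.1f} hours of {BITRATE_KBPS} kbps MP3 audio on one CD")
# prints roughly 12 hours -- comfortably covers the "up to 10 hours" figure above
```

Bump the bitrate and the number shrinks, but the basic point still holds: one disc carries more music than any single outing needs.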

We can also take a look at the new iMac design. It has a 15" flatscreen, but the design looks very awkward to scale up to 17 or 19 inches and beyond: I wouldn't trust that base to hold a 19" flatscreen, and neither should you.  GeForce2 MX video is nice, but it's hardly cutting edge. The upgrade path?  Once again, the only "easy" things to do are adding memory or an AirPort card. Meanwhile, there's this alien-looking design where the main computer looks like half of a white basketball. All this, 256 MB of RAM, and an 800 MHz processor (yes, I know MHz isn't everything, but when you're outnumbered two and a half to one it's hard to ignore) for a MERE $1800. Wow.  I think somebody missed something.  I can build a 2 GHz system with a full gigabyte of memory for that kind of money.

Mistake #4: You forgot to think Tim Allen.

The one thing that drives the computer market these days sounds like a direct quote from Home Improvement: people want MORE POWER.  The market is driven by the demand for ever more powerful hardware, fueled in part by the Windows OS and, more to the point, by the ever-larger, ever-more-impressive games coming out every day.  When someone buys a computer, they want to get as much as they can for their money.

Enter the Mac theory: offer buyers the equivalent of three-year-old hardware off Pricewatch for the same price they'd pay for a brand-new, state-of-the-art (or at least mid-market) system from Dell.  Anybody notice the problem yet? That's right: the Mac prices itself out of reach of the majority of the market, and doubly so when it tries to compete on the same shelf at Best Buy or Circuit City. Simple economics leaves the Mac to the wealthy or the clueless.

Mistake #5: "Total User Experience" means something different to every user.

One of Steve Jobs' big ideas lately is touting the Mac as a "do-it-all" platform, one platform that delivers everything the user could want.  Unfortunately, there's a fundamental flaw in this logic too: he doesn't recognize that people buy computers for COMPLETELY DIFFERENT REASONS.

See that guy over there? He wants a computer that can run the hottest new games. He'll probably build it himself just to make sure it's all state of the art, and it'll keep the same hardware for all of two months before he adds more memory, swaps out the processor, gets a new video card, more hard drive space, a new keyboard, a bigger monitor, or something else. See that old lady? She'd be happy with a 300 MHz computer that lets her send emails to her grandkids in Ohio.  The guy on the street corner?  He wants video editing.  The one walking into the office building? He's a graphic designer who wants to run Photoshop all day long.

And not a single one of them cares what the others' needs are.  The video editor couldn't care less how well his computer handles a scanner, but for the graphic designer it's essential.  Grandma doesn't need a color printer.  Half of them don't need a sound card, but the video game guy wants a Dolby 6.1 system so he can hear every little noise in Return to Castle Wolfenstein.  The video editor might like the Mac's DVD-authoring capabilities, but he might just as well pay the extra money for a $700 DVD-authoring drive on a PC. The others won't notice it either way. Grandma might like receiving videos from her kids, but let's face it: the 30-45 minutes of video they send might as well have been put onto a CD.

This means that every one of them will buy a different computer, Steve. None of them wants to fork over thousands of extra dollars for hardware they'll never use.

Conclusion: Mr. Jobs, take a look outside at the real world.

Sadly, the points I bring up aren't hard to see.  In fact, they're evident all over the place.  The easiest way to see them is to walk into the neighborhood Best Buy and watch what people are buying.  Chances are, they're comparing prices on the hardware, and unfortunately the Apple computers won't be in the running.

If you don't believe me, just take a look at other companies that tried to go it alone in their own industries.  The best example would be 3dfx, if you remember them.  They built a successful business on a model of selling their 3D chips to board makers, then tried to go it alone when they took over STB and cut off the other manufacturers, who quickly saturated the market with NVidia chips.  3dfx didn't survive the change, just as Apple has been relegated to the back corner of the market after trying to go it alone while the Wintel platform kept expanding to cover the needs of users on a case-by-case basis.

If it goes on much longer, I fear Apple might just go the way of 3dfx.
