2012-09-16

Do Androids Dream Of Electric Fruit?

What if Philip K Dick had been born later, and written "Do Androids Dream Of Electric Sheep?" last year? Harrison Ford, Sean Young & Rutger Hauer in "Pad Runner"?


"They were afraid they'd develop features by themselves. So they built in a fail-safe."
"What's that?"
"Two year contract."

"You're walking in the hot sun and you see a Nokia."
"What's a Nokia?"
"You know what a brick is?"
"Sure."
"Same thing."

"Do you like our interface?"
"It's artificial?"
"Of course."
"Must be expensive."
"Very."

"Is this to be a transmission test? Measuring the so-called signal strength? Dial-failure of the keypad? Involuntary draining of the battery."
"We call it iOS for short."

"Have you ever retired an iPad by mistake?"

"I only do i's!"

"What model are you?"
"iPhone 6."
"I knew it."

"The pad that runs twice as fast lasts half as long. And you have run so very very quickly, Roy. Look at you! You're quite a prize."
"I want more range, fucker."

"I've seen things you fanbois wouldn't believe. Angry Birds on fire off the Ice Cream Sandwich. I've seen coin dozer glitter in the darkness near the Torvalds gate. All these upgrades are lost, in time ... like Flash ... in iOS. Time ... to buy."

"Too bad it won't run for ever! But then again, what does!"  

2012-04-18

My Laptop Bit Me - The Shocking Truth

Apple aren't the only company with products that can feel tingly or buzzy to a light touch. It is all down to the widespread use of switched-mode power supplies (SMPS), which have displaced traditional transformer-based supplies because they are lighter, smaller and more efficient.

To simplify, an SMPS first rectifies the AC input, then switches the resulting high voltage on and off very quickly (tens of thousands of times per second), with the output voltage determined by the duty cycle – the proportion of each on-off cycle that the switch spends on. The shorter the on-time, the lower the voltage. The chopped output is then fed through regulators and smoothing capacitors to produce a stable DC voltage.
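To put numbers on the duty-cycle relationship, here's a back-of-the-envelope sketch in Python (an idealised buck-style stage; the figures are my own illustrative assumptions, not measurements of any real supply):

# Idealised switch-mode conversion: the average output voltage is the
# input voltage scaled by the duty cycle D = t_on / (t_on + t_off).

def smps_output(v_in, t_on, t_off):
    """Average output of an ideal buck stage for a given switch timing."""
    duty_cycle = t_on / (t_on + t_off)
    return v_in * duty_cycle

# 325 V is roughly the peak of rectified 230 V AC mains (230 x sqrt(2)).
# Switch on for 3 us of every 50 us cycle (a 20 kHz switching rate).
print(smps_output(325.0, t_on=3e-6, t_off=47e-6))  # ~19.5 V

Shorten the on-time and the output drops; lengthen it and the output rises – that is all the 'regulation' in a switch-mode supply really is.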

The switch circuits "float", bouncing from positive to negative and back. To guide their timing, the switch circuits (there's always more than one) synchronise to the frequency of the input AC voltage (in Australia, 50 Hz), which makes them very picky about the phase of the mains supply. In order to 'float', a switch needs a baseline, so it relies on the common ground (sometimes called the common rail) shared by the entire power supply.

The issue of "tingly" cases can be traced back to a mandatory pair of capacitors which bridge the Active and Neutral inputs to the common rail. Switch-mode supplies create a lot of spurious interference frequencies, and to prevent these from passing back into the mains supply (and cause havoc to other devices) 'decoupling' capacitors form a crude but effective low pass filter which shunts these unwanted noise-frequencies to the common ground.

When the device's common ground is connected to an earth, it is maintained at zero potential, and the EMI gets dumped. When the device's common ground is not connected to an earth, the common ground (which often includes the case, if it is metal) is energised to half the mains AC voltage, but at a very very high impedance. There's not enough current to harm you, nor the electronics inside the device being powered by the SMPS.
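To get a feel for just how high that impedance is, here's the current that can flow through a typical EMI filter (Y-class) capacitor at mains frequency – the component values are common textbook figures I've assumed, not taken from any particular laptop supply:

import math

# Current through a capacitor's reactance: I = V * 2 * pi * f * C
f_mains = 50.0      # Hz (Australian mains frequency)
c_filter = 2.2e-9   # 2.2 nF, a common Y-class filter capacitor value
v_float = 120.0     # floating common ground sits at half the mains voltage

current = v_float * 2 * math.pi * f_mains * c_filter
print(f"{current * 1e6:.0f} microamps")  # ~83 uA: a tingle, not a shock

Tens of microamps is enough to feel under the right conditions, but orders of magnitude below dangerous levels.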

So what causes the tingling sensation? If you touch the case (and thus, common ground) you, dear human, become the earth -- current flows from the case into you, and through you to the real earth. Sure, it's a high-resistance path, but it is a conductive path that was not there before. Whilst the current flowing through you borders on the submicroscopic, you are being exposed to 120 volts; fortunately, it's current that kills, not volts. If you touch a floating common ground lightly enough, and the conditions are right, you'll feel each rise of the 50 Hz mains as the resistance changes between the tiny contact point on your skin's surface and the electrically-driven pressure (touch) nerve endings beneath it.

In short (ha), you should always use a triple-prong lead -- one with an earth pin and conductor -- between the power point and the SMPS, and plug it into a power point with a good earth. If your laptop's power supply only uses a two-pin 'donut' mains cable, you should invest in an aftermarket "universal laptop power supply". Alas, for MacBook Pro owners this is not an option thanks to the custom 'MagSafe' power connector; users of Apple's laptops should instead make a point of using the full-length mains cable that shipped with their sleek metal slabs rather than the simple 'duck-head' snap-in adaptor. Even though the MagSafe power supply itself is not connected to the mains earth, there is just enough inductive leakage for the floating charge to dissipate through the plastic sheath of the mains cable.

It's worth noting that the fuzzy-electronics effect can also be felt if you use an earthed power point whose Active and Neutral are wired back to front. Older houses and bodgy-job wiring can have them swapped, and for many decades it didn't matter. If you know your power supply has an earth pin and should be earthed, but you still get tingles and zaps, you have a serious mains wiring issue and should get a qualified electrician in to investigate as soon as possible.

2012-04-10

First-time-user boot trick for Mac OS X

So it is time to say goodbye to the Mac companion of the last year and upgrade to the newest. Old Faithful goes up for sale, but being the conscientious Mac user you are, you would like to hand it over with all the latest updates already installed. The tricky part is doing that while leaving the machine set up so that when the new owner gets it, they see the "Welcome" movie and can set things up their own way.

Here's what you need to do to achieve just that. Be warned: these are all text commands executed in OSX's single-user mode – if the commandline scares you (as it should), triple-check everything you are told to type (the lines starting with #) before pressing the Return key.

First thing, of course, is to boot from your handy OSX installer, erase the hard drive, and install a fresh Mac OS. Endure the Welcome movie, make a new user account (call it 'Set Up'), log in, and install all available updates, either with Software Update or any stand-alone downloads that you saved from last time. Once Software Update tells you you are up to date, it's time to restart.

Be brave, my mouse-lovers! This won't hurt, I promise ...

(BTW: the '#' is the command prompt for single user mode. Type everything in after it, check your work, then press Return and go to the next step. If a command is long enough to wrap around to the next line in your browser, do not press Return, just keep typing. Remember, every new command starts at the #)

1 Press Command-S during startup to get into single user mode.

2 Check the filesystem:

# /sbin/fsck -fy

3 Mount the root partition as writable:

# /sbin/mount -uw /

4 Remove the hidden .AppleSetupDone file:

# rm /var/db/.AppleSetupDone

5 Reset the OS language choice, so the language chooser runs on first boot:

# touch /var/db/.RunLanguageChooserToo

6a For Mac OS X 10.5 and 10.6, do:

# launchctl load /System/Library/LaunchDaemons/com.apple.DirectoryServices.plist

For Mac OS X 10.7, do:

# launchctl load /System/Library/LaunchDaemons/com.apple.opendirectoryd.plist

Then run the following pair of commands for every user previously defined on the machine:

# dscl . -delete /Users/{username}

# dscl . -delete /Groups/admin GroupMembership {username}

6b For older versions of Mac OS X, do:

# rm -rf /var/db/netinfo/local.nidb

7 Remove the home directories of users. For every user do:

# rm -rf /Users/{username}

8 If applicable, remove any files already created in root's home directory:

# rm /var/root/.bash_history

9 Shutdown (or reboot to verify the procedure worked):

# shutdown -h now
-or-
# reboot

2012-04-08

Invincible Apple: Jobs 1 Geeks 0

(I don't know whether this article was inspired by my own "Curséd Are The Geek" posting or not, but it did appear shortly thereafter. As it is so absolutely spot-on and well written, I shamelessly add a copy of it here.)

Invincible Apple: 10 Lessons From the Coolest Company Anywhere

Everyone wants to be like Steve Jobs and his powerhouse company. It's not as easy as it looks.
by Farhad Manjoo, July 1, 2010


On Wednesday, May 26, 2010, just after 2:30 p.m., the unthinkable happened: Apple became the largest company in the tech universe, and, after ExxonMobil, the second largest in the nation. For months, its market capitalization had hovered just under that of Microsoft -- the giant that buried Apple and then saved it from almost certain demise with a $150 million investment in 1997. Now Microsoft gets in line with Google, Amazon, HTC, Nokia, and HP as companies that Apple seems bent on sidelining. The one-time underdog from Cupertino is the biggest music company in the world and soon may rule the market for e-books as well. What's next? Farming? Toothbrushes? Fixing the airline industry?

Right now, it seems as if Apple could do all that and more. The company's surge over the past few years has resembled a space-shuttle launch -- a series of rapid, tightly choreographed explosions that leave everyone dumbfounded and smiling. The whole thing has happened so quickly, and seemed so natural, that there has been little opportunity to understand what we have been witnessing.

The company, its leader, and its products have become cultural lingua franca. Dell wants to be the Apple for business; Zipcar the Apple for car sharing. Industries such as health care and clean energy search for their own Steve Jobs, while comedian Bill Maher says the government would be better run if the Apple CEO were head of state. (The Justice Department and FTC, which are both investigating Apple's tactics, might disagree.) A Minnesota Vikings fan dubs his team the "iTunes of quarterbacks," serially sampling one track from a player's career, as with Brett Favre, rather than buying the whole album as the Colts have done with Peyton Manning.

This shorthand is useful but tends to encourage a shallow notion of what it takes to emulate Apple. And Apple doesn't delineate the key factors of its success. Those principles are more closely guarded than its product pipeline. Jobs did not comment for this article. On-the-record comments from the CEO occur in only the most orchestrated environments (at MacWorld, say, or in newsweekly magazine stories timed to new product announcements), or in late-night email messages that defy explication. When it comes to the special sauce that makes his company the paragon of U.S. and global business, the CEO is silent.

How does one become the "Apple of [insert industry here]"? After speaking with former employees, current partners, and others who have watched Apple for many years, it's clear that the answers center around discipline, focus, long-term thinking, and a willingness to flout the rules that govern everybody else's business. It's an approach that's difficult to discern and tougher to imitate. But everyone wants to give it a try. Here, then, is our report on the Apple playbook. Short of something falling into your hands in a Bay Area bar, this may be as close to the truth about Apple as you're going to get.

1. Go Into Your Cave

If Steve Jobs were an architect, he'd work at the futuristic glass-and-steel San Francisco offices of international architecture and design firm Eight Inc. The walls are bathed in white, and the vibe is akin to working behind the Genius Bar. Here, on the second floor, look to the back wall. There you'll discover a frosted-glass door emblazoned with a white Apple logo. Behind it is Eight's Apple team -- a small group that has worked with the company since the late 1990s to conceive the look and feel of its "branded consumer experiences," which include its trade shows, high-impact product announcements, and 287 retail stores. The door is locked.

What goes on behind the locked door? "We really can't say too much," says Wilhelm Oehl, a principal designer, when I visit him one cloudy spring afternoon. He describes his work with Apple in only the vaguest, most anodyne terms -- to "redefine elegance," to keep an "integrity of design" that "makes the product the hero." Finally, Oehl mumbles, "We try to capture something that feels like magic."

These frosted-glass doors, and similar ones all around the world protecting other caves of Apple thinkers, are emblematic of Apple's fanaticism for secrecy. But those doors are more than mere paranoia. Apple sets its own agenda and tunes out the tech wags -- competitors, industry observers, analysts, bloggers, and journalists like myself -- who constantly spew torrents of advice, huzzahs, and brickbats in its direction. Behind its doors, Apple can ignore us all.

Jobs has never cared much about what the tech industry has to say. Back in the early 1980s, when he was leading the team building the Mac, Jobs would often give his engineers guidance on what the computer should look like. "Once, he saw a Cuisinart at Macy's that he thought looked incredibly great," says Andy Hertzfeld, one of the engineers on the original Mac team and the author of Revolution in the Valley: The Insanely Great Story of How the Mac Was Made. "And he had the designers change the Mac to look like that." Another time, he wanted it to look like a Porsche.

Get the picture? Computers should be more like sports cars and kitchen appliances. That's Apple's audience: high-end mainstream, the folks who buy -- or aspire to buy -- Porsches. You don't connect with those consumers by listening to Silicon Valley. Techies, even after all these years of Apple watching, still get bogged down in specs, speeds, and developer contracts. Magic doesn't happen in an echo chamber.

2. It's Okay to Be King

Mike Evangelist (yep, that's his name) still remembers one of his first meetings with Jobs. It took place in the Apple boardroom in early 2000, just a few months after Apple purchased the American division of Astarte, a German software company where Evangelist was an operations manager. Phil Schiller, Apple's longtime head of marketing, put Evangelist on a team charged with coming up with ideas for a DVD-burning program that Apple planned to release on high-end Macs -- an app that would later become iDVD.

"We had about three weeks to prepare," Evangelist says. He and another employee went to work creating beautiful mock-ups depicting the perfect interface for the new program. On the appointed day, Evangelist and the rest of the team gathered in the boardroom. They'd brought page after page of prototype screen shots showing the new program's various windows and menu options, along with paragraphs of documentation describing how the app would work.

"Then Steve comes in," Evangelist recalls. "He doesn't look at any of our work. He picks up a marker and goes over to the whiteboard. He draws a rectangle. 'Here's the new application,' he says. 'It's got one window. You drag your video into the window. Then you click the button that says burn. That's it. That's what we're going to make.' "

"We were dumbfounded," Evangelist says. This wasn't how product decisions were made at his old company. Indeed, this isn't how products are planned anywhere else in the industry.

The tech business believes in inclusive, bottom-up, wisdom-of-crowds innovation. The more latitude extended, the greater the next great thing will be. Nowhere is this ethos more celebrated than at Google, where employees are free to spend some of their working hours building anything that strikes their fancy. A few of these so-called 20%-time projects have become hits for Google, including Gmail and Google News.

Apple's engineers spend 100% of their time making products planned by a small club of senior managers -- and sometimes entirely by Jobs himself. The CEO appoints himself the de facto product manager for every important release; Jobs usually meets with the teams working on these new gadgets and apps once a week, and he puts their creations through the paces. "He gets very passionate," Evangelist says. "He'll say, 'This is shit, we can do much better.' "

How can it be wise for so few people to have the authority -- not to mention the time -- to make most of the creative decisions at a company as large as Apple? Bottlenecks do result. According to one former Apple engineer, a staff of about 10 "human interface" designers is in charge of the entire Mac operating system. With such a small group making decisions, Apple can put out only one or two new products a year.

But this approach works because Jobs and his team know exactly what they want. A more decentralized company like Google may launch dozens of products a year, but more of them fail. (Have you Waved much lately?) Apple hits for a high average. And Apple's strong management keeps the troops focused. "Everybody knows what the plan is," says Glenn Reid, a former Apple engineer who created iMovie and worked on several other iLife apps. "There's very little infighting."

"I still have the slides I prepared for that meeting, and they're ridiculous in their complexity," Evangelist says, remembering how everyone in the room understood, immediately, that Jobs's rectangle was right. "All this other stuff was completely in the way."

3. Transcend Orthodoxy

A battle rages in the tech industry, fought on the side of "good" by those who believe that software should be "open" -- in other words, accessible to developers of all stripes -- and on the other by misanthropes who feel that it's fine to limit development. Techies generally believe that open is not only trendy but virtuous. Google trumpets that its Android phone is more open than the iPhone. Adobe brags that because its software tools help developers create write-once, run-anywhere software, it is the epitome of openness. Apple counters that it wants to replace Adobe's proprietary Flash with HTML5 and H.264, which are actually open Internet standards. Nonetheless, Apple is perceived as being closed. Cory Doctorow, author and co-editor of the widely noted tech blog Boing Boing, distilled the anti-Apple argument into a single line: "If you want to live in the creative universe where anyone with a cool idea can make it and give it to you to run on your hardware, the iPad isn't for you."

This argument may not engage you, and perhaps you even find it boring. That makes you just like Apple. Despite all the noise about Apple's closed ideology, the company adopts positions based on whether they make for good products and good business: You know, like a results-focused company, not a dogmatic college philosophy major. For example, Apple happily accepted the music industry's copy-protection requirements because they helped it successfully launch the iTunes store. When they no longer made business sense, it dropped them.

For Apple, the ideas of closed and free aren't in conflict. "We're just doing what we can to try and make [and preserve] the user experience we envision," Jobs emailed Gawker blogger Ryan Tate, who had baited the CEO in the wake of Apple's decision to ban Flash from the iPhone and iPad. "You can disagree with us, but our motives are pure." The App Store, Jobs wrote Tate, offers "freedom from programs that steal your private data. Freedom from programs that trash your battery. Freedom from porn. Yep, freedom."

Developers have griped loudly that the App Store is closed because it dictates how apps get built. But that's misleading: The problem isn't that it's closed, but that its rules are arbitrary, hidden, and frequently changing. If Apple embraced transparency, it could avoid much of this debate. But fundamentally, who really cares about the verbiage? While the bloggers rage on, the App Store is a total success, and even its fiercest foes admit that it offers a dead-easy, totally fun way to find useful things to soup up your phone and tablet. For Apple, that's the only philosophy that matters.

4. Just Say No

The new MacBook Touch is bendable. Its single OLED screen features a flexible seam, allowing the machine to function as a laptop, a 13-inch tablet, or even a desktop, depending on how you flex it. The computer has half a dozen peripheral ports, includes a stylus, and comes in two colors. And, I should add, it doesn't exist. It was designed by Tommaso Gecchelin, a student in Venice, Italy, who is unaffiliated with Apple, but is one of a growing subculture of people around the globe who create and share concept designs of the Apple products they'd like to see.

Although many of these illustrated fantasies are quite beautiful, and some are uncannily realistic, their fatal flaw is often the same. They're larded with features. Apple is about less (those six ports on the MacBook Touch should have been a dead giveaway that this wasn't an Apple product). Even Gecchelin concedes, "This is not the Apple philosophy."

Jobs's primary role at Apple is to turn things down. "He's a filter," says the Mac engineer Hertzfeld. Every day, the CEO is presented with ideas for new products and new features within existing ones. The default answer is no. Every engineer who has gone over a product with him has a story about how quickly Jobs reaches for the delete key. "I'm as proud of the products that we have not done as the ones we have done," Jobs told an interviewer in 2004.

It's not just Jobs's consistent aversion to complexity that prompts him to say no. Apple thrives on high profit margins, and having the willpower to say no keeps production costs down. Eliminating features also helps build buzz. "The great thing about omitting a feature that people want is that then they start clamoring for it," says Reid, the former Apple engineer. "When you give it to them in the next version, they're even happier somehow." Apple has pulled off this trick time and again, most recently with the iPhone OS 4. It includes multitasking, a feature that customers began asking for in 2007, intensifying their pleas after Palm debuted multitasking in its WebOS last year.

How could the iPhone not have something this elemental until its fourth generation? Or take the iPad: Really, no camera? In 2010? Even the iPad-adept 2-and-a-half-year-old girl in the YouTube video complained about it. Come on, Apple, what are you thinking?

Maybe it's thinking of a reason for you to come back next year.

5. Serve Your Customer. No, Really

Among the many angry customers whom Jeremy Derr encountered during his time as an Apple Genius, the one he remembers best is the professional photographer with the bad FireWire port. "This guy had been dealing with the issue for weeks, so by the time he came in, he was pretty distraught," says Derr, who began working as a Genius at Apple's Houston Galleria store in 2002. Derr determined that the machine would need to go in for service and the repair would take a week. "That's when he absolutely lost it."

However great your product, something will invariably go wrong -- and as the classic customer-service maxim goes, only then will the customer take the true measure of your firm. In recent years, companies of all kinds -- but especially Apple's competitors in the computer and phone businesses -- have adopted strategies that amount to customer avoidance rather than service. They shunt their customers off to outsourced call centers staffed with underpaid agents who read from scripts, or worse, send them to an online FAQ. When Google launched its Nexus One smartphone through its online store in January, it forgot to make any real people available to field support questions. It didn't take long for the company's online forums to be flooded with angry customers.

When Apple devised its retail strategy a decade ago, the company had a single overriding goal: to launch stores that were unlike anything that customers associated with the computer industry. Apple hired Ron Johnson from Target and George Blankenship from Gap. (Last year, Blankenship decamped to Microsoft's new retail-store effort.) Johnson began by asking shoppers to name their best customer-service experience, and he found that most of them agreed on a single setting, the hotel concierge desk. Their effort to re-create the same friendliness you'd find in a Four Seasons Hotel lobby led to the Genius Bar, which Johnson calls the "heart and soul" of every Apple Store.

Geniuses will look at any Apple product for free, regardless of where you bought your item. They'll take a stab at fixing non-Apple software, and they'll even help customers with non-tech-support tasks. "I once helped a woman learn iMovie so she could record her wedding reception," Derr says.

Apple doesn't charge for any of this. Customers pay only for repairs on out-of-warranty goods, and Derr notes that Geniuses have almost total leeway to waive these fees. How can Apple afford to be so generous? "It's a loss leader," says Derr, who left the Apple Store in 2006 to start a software company. "Sometimes someone comes in for help and decides to buy something on the way out."

That's exactly what happened with Derr's angry photographer. As the man ranted about being unable to do without his computer, Derr suggested that perhaps he should invest in another laptop as a backup. "It was like I'd said the magic words," Derr says. The photographer left the store with a brand-new machine.

6. Everything Is Marketing

Just as the Genius Bar has proved to be genius, the now-classic Apple slogan "Think Different" also turns out to be more than just words: The brains of Apple fans really are different. When Martin Lindstrom, a brand consultant and author of Buyology: The Truth and Lies About Why We Buy, examined those brains under a functional magnetic-resonance-imaging scanner, he discovered that Apple devotees are indistinguishable from those committed to Jesus. "Apple's brand is so powerful that for some people it's just like a true religion," Lindstrom says.

Apple cultivates religious fervor among its adherents in a number of subtle ways, including its mysteriousness and its suggestion that customers are among the chosen ones. Perhaps most important, though, is Apple's devotion to symbology. Its most effective marketing efforts, Lindstrom says, are built into the products themselves. Think of the iPod's white earbuds, the Mac's startup sound, or the unmistakable shape of the MacBook's back panel. None of these choices were accidental. Apple understands the lasting power of sensory cues, and it goes out of its way to infuse everything it makes with memorable ideas that scream its brand.

This extends to the fanatic attention to detail that Apple brings to its biggest product launches. These usually commence after months, possibly years, of rumors (we'd been hearing about an Apple tablet since 2002). The actual launch day is choreographed like a dictator's display of military splendor. One example: Apple buys up all the bus-stop ad space near the Yerba Buena Center for the Arts, the San Francisco venue where it has held its recent events. It then switches its posters while Jobs is speaking. So this past January 27, when I walked into Apple's iPad debut, the street ads depicted something old; when I left, the iPad was everywhere I looked. Study the iPad in the poster and its clock says 9:41 a.m. Why? Apple thought of that, too. That's the exact moment that Jobs revealed the iPad to the world. Somewhere, Kim Jong-il is smiling. Who else but Apple orchestrates its branding to this nth degree?

There may be a limit to the value of Apple's increasing cultural ubiquity. The company risks a Starbucksian-level backlash. This, Lindstrom says, is Apple's main branding problem today. Once we're all members of the church of Apple, will we all keep praying together? Or will the pioneers strike out in search of something less common, the next insanely great thing?

7. Kill the Past

Don't be surprised if Apple someday unveils a "desk-free" computer -- a machine that lets you slump on the couch with a wireless keyboard while surfing on a giant projected screen. Or a surface that can recognize handwriting gestures, in order to let you sign your name on a touch screen without using a stylus. There may also be a bright future in three-dimensional computing. Instead of fussing with flat windows on your iMac, cubes, prisms, and pyramids would represent apps, and you'd rotate one in 3-D space to interact with different parts of the program.

More fanboy hallucinations? Nope. They're all mentioned in recent Apple patent filings. We may never see any of these products, but no other company reimagines the fundamental parts of its business as frequently, and with as much gusto, as Apple does. In just the past few years, for instance, we saw the company remake its entire line of notebook computers by instituting a "unibody" production process. Now its computers are laser-cut out of a single slab of aluminum or polycarbonate plastic, a dramatic shift from the way the industry has made portables since their inception.

Apple disregards the entire concept of backward compatibility, which is both a blessing and a curse for rivals such as Microsoft. Over its history, Apple has adopted new operating systems and underlying chip architectures several times -- decisions that rendered its installed base instantly obsolete. Jobs killed the floppy disk in the iMac, and he claimed that optical drives were on their way out with the MacBook Air. Now, with the company's embrace of touch screens, Apple seems to be gunning for the mouse, a technology that it helped bring into wide use in the 1980s. Does this relentless eye toward the future always work? No. Jobs killed the arrow keys on the first Mac; Apple was forced to add them back in a later version, and it has kept them in all its Macs ever since.

More often, though, Apple's willingness to abandon the past makes for better products. Nothing holds it back, so it can always stay on the edge of what's technologically possible. Plus, the strategy forces the faithful to keep buying new versions. One Apple customer recently emailed Jobs to ask whether Apple would continue to support the first iPhone, which launched in 2007. Jobs's response: "Sorry, no."

8. Turn Feedback Into Inspiration

Steve Jobs has often cited this quote from Henry Ford: "If I'd have asked customers what they wanted, they would have told me, 'A faster horse!' "

This is Jobs's defense of Apple's reluctance to listen to even its most passionate customers, and the line is a good one to remember the next time you're considering a new round of focus groups. "The whole approach of the company is that people can't really envision what they want," says Reid. "They'll tell you a bunch of stuff they want. Then if you build it, it turns out that's not right. It's hard to visualize things that don't exist."

But Jobs doesn't exactly ignore customers; he uses their ideas as inspiration, not direction; as a means, not an end. Ever since the netbook boom began, many people have begged Apple to put out its own. These tiny, ultra-portable machines represented the fastest-growing segment of the PC business, and the company seemed to be missing out. Some people (yours truly included) even went so far as to hack PC netbooks in order to run the Mac OS. Jobs could not have been more dismissive. "We don't know how to make a $500 computer that's not a piece of junk," he said of the prospect of an Apple netbook.

Cut to January 2010, and there's Jobs unveiling a $500 computer that isn't a piece of junk. But the iPad isn't a netbook. It's both more, and less -- not just a faster horse.

9. Don't Invent, Reinvent

"Revolutionary" is one of Jobs's favorite words. When he revealed the iPhone, he said, "Today, we are introducing three revolutionary products" (the punch line being that he debuted just one device with the power of three). Three years later, he introduced the iPad by saying, "We want to kick off 2010 by introducing a magical and revolutionary product." He's been doing this a long time: In 1989, he introduced the Next computer as the "next computing revolution."

Revolutionary is a word that drives his critics batty. Jobs touts each creation as unique and original. Detractors insist that they all borrow freely from preexisting technologies. And it's hard to argue, given that music players existed well before the iPod, and smartphones predate the iPhone. Some of those critics, most recently Nokia and HTC, have taken Apple to court for patent infringement, a charge that Apple is quite familiar with, having settled suits leveled against it relating to the iPod (paying $100 million to portable media maker Creative Technology) and the iPhone (Klausner Technologies, a patent holding firm, had a patent on visual voice mail).

This all depends on what your definition of revolutionary is. Apple's talent is far more cunning and more profitable than mere infringement. To use a musical analogy, Apple's specialty is the remix. It curates the best ideas bubbling up around the tech world and makes them its own. It's also a great fixer, improving on everything that's wrong with other similar products on the shelves. (One of the underrated joys of a Jobs product demo is the trash talking about what everyone else in the market doesn't understand.)

The iPad is a perfect example. Much of it has been done before; Bill Gates demonstrated a Windows-based tablet in 2001, and he predicted that it would become the dominant computing format in five years' time. Windows tablets flopped immediately. Why? First, Microsoft lamely re-created the desktop's interface, and it required users to deal with a clunky stylus to get anything done. Gates also didn't encourage developers to create tablet-specific apps. Indeed, as Dick Brass, a former Microsoft executive, wrote in The New York Times last February, Microsoft's own Office team refused to modify the productivity suite for tablet computing.

Jobs saw that Apple could fix all these issues. The operating system: Apple had solved that problem, to great acclaim, with the iPhone. Interface: The iPhone's multi-touch did away with the need for a stylus. Apps: The App Store had already proved remarkably capable of encouraging developers to create programs for a new gadget. All that, plus a lot of thinking about design and marketing, and voilà! A tablet that the whole world finally wanted. Was the iPad truly a "new" device? Does it even matter? Apple sold 2 million of them in the first 60 days.

10. Play by Your Own Clock

A few weeks after the iPad hit the shelves, word leaked that HP had decided to delay and retool the Slate, the tablet PC that it had promised would rival Apple's "Jesus tablet." The same day, Gizmodo reported that Microsoft had killed the Courier, another reputed iPad killer. Research in Motion, too, has delayed its planned tablet until 2011.

From what we saw and heard of these devices, they were more complicated than the iPad -- full-blown computers in tablet form rather than the streamlined iPad. Caught off guard by the market response, these rivals realized that they'd be releasing their version of a faster horse. They went back to the drawing board. Meanwhile, other Apple rivals, including Google, British Telecom, and Intel, are now scrambling to enter the tablet game.

Apple doesn't get caught up in this competitive frenzy (perhaps because it's working behind those locked doors). It plays by its own clock. Apple's release schedule is designed around its own strategy and its own determination of what products will advance the company's long-term goals. It can do this, in part, because of Jobs's exalted position among chief executives. The average American CEO's tenure is about six years, and it's steadily declining. Many CEOs are just a couple of consecutive bad quarters away from pink slips. Jobs knows he's never going to get fired, so he's liberated to devote years -- if that's what it takes -- to attain Apple's high standards and hit the fat part of the adoption curve. Most CEOs aren't so lucky.

The company's long-range focus allows it to do something much more sophisticated as well: build the future into its current products. For the past decade, the company has released a series of platforms -- Mac OS X, the iPhone OS, iTunes, its retail stores, the App Store, and recently its own microprocessors and iAd, a mobile-advertising system -- that give it a stepping stone to its next products. The iPad is the culmination of all these things. Its glass screen, interface, unibody construction, operating system, and App Store all originated in other Apple products. Within the iPad are clues to Apple's future gadgets and services, though we'll only be able to spot them in retrospect.

Of all the points we've covered here, Apple's willingness to go long is perhaps its greatest strength. The company has a plan. It's on the right path, and that fuels both confidence and grand ambitions. It's executing, to say the least. Which is why the "Apple of American business" is, well ... Apple.


(Originally found at http://www.fastcompany.com/magazine/147/apple-nation.html?page=0%2C0)
(Copyright Farhad Manjoo and Fast Company 2010, used without permission whatsoever. Go visit, they be cool.)

2011-12-24

Has The Soft-synth Killed Synthesis?

1960: In The Beginning, Bob Created The Filter

If you don't know who Bob Moog is, you don't deserve to own a synth (virtual or otherwise) and should leave the internet, find a Strat and a monkey with a drumkit, and stick to doing covers of Elvis or The Jam – I hear there's good money to be made on the cabaret circuit :)

Even though Bob had some musical background, his first circuits were never intended to be used as musical instruments. But when a friend heard the first tones and warbles and commented that they sounded like something from the movie 'Forbidden Planet', he changed his mind, figured out how to add an old organ keyboard, and the rest is history. Musical academics soon queued up for one, then the musique concrete crowd, then the prog rockers started calling. Walter Carlos' synthetic orchestrations for 'A Clockwork Orange' made the Moog a household name, and Keith Emerson showed what it could do for rock, which led Bob to take a few key circuits from his wall-filling Series 900 boxes and create the Minimoog.

1970: Synthetic Cows And The Polybeast - Thanks For The Memories

Seeing the Minimoog being used as a successful live instrument enticed others to build their own, such as the ARP Odyssey, the EMS Synthi and Tom Oberheim's SEM – a synth-in-a-box meant to be plugged into a Minimoog but which ended up with a keyboard of its own. The synth escaped the realm of prog-rock and was soon found in classical performance, jazz and pop music, as well as spawning its own horrendous sub-genre of novelty albums full of cover tunes played on synths, inevitably using "Moog" in the title even when the synths were from other makers. Thanks to pulp-musicians like Hugo Montenegro, we now know that plastic cows say "moog".

I know. Sad, isn't it.

Despite this recognition, synths were like a saxophone in that they could only make one note or sound at a time, and there was no way to save a particular arrangement of knobs & switches except by human memory. Some people used patch-sheets to draw knob positions, some took Polaroid photos of the front panel, and Rick Wakeman would glue the knobs of his Minimoog into place once he got a sound he liked, then pull another Minimoog out of his collection. Moog tried to solve the first problem with their Polymoog, but it was closer to an organ than a synth, which is why Yamaha's CS50, released a year later in 1976, is considered the first true polyphonic synth. Yamaha's attempt to solve the second problem of remembering sounds was odd – on the CS50's big brother, the massive GX1, there were four boxes on the top, each holding what was essentially a miniaturised but complete copy of the front panel.

The big breakthrough that solved both problems came in 1978 with Sequential Circuits' Prophet 5 – digitally controlled analogue oscillators and filters that could handle a five-finger chord, coupled with a small CPU and battery-backed RAM that could save every control's setting in ten (and shortly after, one hundred) patch memories.
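In software terms, what Sequential pulled off is conceptually tiny – a patch is just a snapshot of every panel control, stored in a numbered slot. A toy sketch of the idea in Python (nothing to do with the actual firmware, of course):

# Toy model of synth patch memory: a "patch" is a snapshot of every
# panel control's value, kept in a numbered slot of battery-backed RAM.

panel = {"osc_a_freq": 64, "osc_b_freq": 67, "cutoff": 90, "resonance": 30}

patch_memory = [dict() for _ in range(10)]   # ten slots, as in 1978

def save_patch(slot):
    patch_memory[slot] = dict(panel)   # freeze the current knob settings

def recall_patch(slot):
    panel.update(patch_memory[slot])   # restore every control at once

save_patch(0)
panel["cutoff"] = 10    # tweak away...
recall_patch(0)         # ...and snap straight back to the saved sound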

1980: The First Rise & Fall Of Originality

As the 80's progressed, the focus of electronic synthesis shifted away from rich prog-rockers and academics to English punk bands, which led to wide acceptance by pop & rock bands once tracks like "Fade To Grey" by Visage and New Order's "Blue Monday" became top-ten hits. Aspiring bands the world over wanted synth-players as well as drummers and bassists and guitarists, and music shops soon had American and Japanese Prophets and Odysseys and Jupiters next to the Fender guitars and the Zildjian cymbals. One corner of the shop would have staff yelling at guitarists to stop playing "Stairway To Heaven", the other had staff yelling at keyboardists to stop playing "Jump" and "Oxygene".

As more and more broadcast music had synthetic content in some form, synth-nerds would often play synth-spotting – instruments from particular makers often had a recognisable character, which was pretty much the only clue we had, because every synth-player made their own sounds. Many synths of the day had no patch-memory, or at best very little, and those with patch memory often had all of the factory sounds overwritten. In music videos, many of the synth-players had one hand on the keyboard and the other flying around the controls, twirling knobs and flicking switches as they played. Brian Eno in his Roxy Music days was a good example, as was Manfred Mann in the clip for "Blinded By The Light". At the other end of the spectrum, successful synthesists would be seen with the $30,000 Fairlight (Herbie Hancock's "Rockit") or the $100,000 Synclavier (Tony Clark in Dire Straits' "Money For Nothing").

Then the music died.

On November 22nd 1983, the market's first affordable all-digital synthesiser was launched – the Yamaha DX-7.

Unlike every synth before it, the DX-7 did away with analogue subtractive synthesis in favour of FM (frequency modulation) synthesis, which made for incredibly accurate bell, chime, brass, percussion and piano sounds. It also had 32 slots of patch memory and, in another first, a cartridge slot where more patches could be stored. Compared to the twenty or so knobs and faders analogue synths usually had, the DX-7 had over 180 separate parameters that combined to make up a single patch or sound ... but they were all buried behind a 2x16 character display and one – count 'em, one – fader. The range of sounds the DX could produce was huge, and it could mimic real instruments much better than anything analogue ever could. It also had, compared to its competition, a ridiculously low price. The DX-7 sold like hotcakes. The airwaves filled with bands that sounded a lot like each other, and in almost every one of them the DX-7 'E.Piano2' or 'Marimba' or 'Bell1' factory patches could be heard.
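For the curious, the core of FM is only a couple of lines of maths: one sine wave (the modulator) wobbles the phase of another (the carrier), and the modulation index sets how bright and clangorous the result is. A minimal two-operator sketch in Python – a real DX-7 chains six operators with individual envelopes, so treat this as illustration only:

import math

def fm_sample(t, f_carrier=440.0, f_mod=220.0, mod_index=3.0):
    """One sample of two-operator FM: a carrier phase-modulated by a sine."""
    return math.sin(2 * math.pi * f_carrier * t
                    + mod_index * math.sin(2 * math.pi * f_mod * t))

# One second of audio at CD rate; raise mod_index for brighter, harsher tones.
sample_rate = 44100
samples = [fm_sample(n / sample_rate) for n in range(sample_rate)]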

The other two big Japanese synth-makers, Korg and Roland, quickly realised that adopting digital techniques made the cost of a synth's actual hardware plummet. Roland introduced Linear Arithmetic synthesis with over four hundred parameters per patch, Korg went for a mix of tiny sampled waveforms from real instruments and processor control of analogue filter-chips (with up to 200 parameters per patch, depending on model), whilst newcomer Kawai brought cheap additive synthesis – once the domain of supersynths like the Synclavier – to the market. These synths all shared the same overall characteristics: a bland front panel, an LCD display, never more than four physical knobs or faders, a depressingly small keypad, a card or cartridge slot, and MIDI jacks. Thanks to MIDI, they could even dispense with the keyboard, and the rack-mount synth module was born. The big Americans – Sequential, Moog and Oberheim – stuck mostly with their analogue synths and tried to increase flexibility whilst keeping as many knobs as possible, but the cost of all those physical controls meant they couldn't compete with the Japanese. Even after adopting the same approach, machines like the Oberheim Matrix1000, the Prophet VS and the Moog Source could not save the companies.

This made creating new sounds a major chore for most musicians. Gone was the ability to reach out and tweak a knob for an instant change in the sound; instead you had to know your synthesis and know your synth, press a few buttons to reach the parameter you wanted, press a few more (or move the data-entry fader) to set the new value, and then play the keyboard. None of the early digital synths let you change the sound in real time: you couldn't hold a key down, change a parameter's value, and hear the difference without letting go and pressing the key again.

A new market sprang up almost overnight – pre-programmed cartridges full of patches made by other people. Personal computers of the day – Apple II's, Commodore C64's and Ataris – grew MIDI interfaces and patch-librarian programs. Soon you could buy a thousand new sounds on a single floppy disk. No-one bought synths to make their own unique sounds any more, apart from the very few anal-retentives who made sounds to sell.

Sampling – recording a sound and then altering its playback speed to change pitch – had been around as long as the personal computer had, but for a stand-alone musical instrument with good audio quality you had to spend big bucks on a Fairlight or Synclavier.
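The trick itself is almost embarrassingly simple: read through the recorded samples faster or slower than they were recorded. A bare-bones Python sketch (nearest-neighbour lookup; real samplers interpolate far more carefully):

def repitch(samples, ratio):
    """Play a sample buffer back at a different speed.
    ratio=2.0 sounds an octave up (and plays twice as fast);
    ratio=0.5 sounds an octave down (and plays twice as slow)."""
    out = []
    pos = 0.0
    while pos < len(samples):
        out.append(samples[int(pos)])   # nearest sample, no interpolation
        pos += ratio                    # step through at the new speed
    return out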

An offshoot of cheap digitally controlled synthesis was the introduction of affordable hardware-based samplers, such as Emu Systems' Emulator, the Ensoniq Mirage, and the Akai S1000. The patch-library companies were soon adding disks full of samples to their catalogues, but it was Akai's decision to add a CD-ROM drive peripheral that made it the studio choice, and cemented the Akai sample format as an industry standard. The Aussies continued to battle on, but the sample-driven Fairlight could not compete against machines one twentieth of the price. The Synclavier hung on for a few more years as a high-end machine, thanks to a few huge hard-drive-only sampler additions and the Denny Jaeger String Library – and only because it was cheaper to hire a Synclavier than it was to get the London Symphony Orchestra in to do your film score.

If you couldn't compete on price, you soon died. Fairlight got out of the music game in 1989, and New England Digital folded completely in 1993. Sequential Circuits was bought up by Yamaha, Oberheim was bought by Gibson and stopped making synths, Voyetra and many other smaller makers died off entirely, Kawai faded into obscurity, and Bob Moog hung on by the skin of his teeth making expensive theremins. By 1992, eighty percent of all synths sold bore a badge reading Roland, Yamaha or Korg, and every device had that telltale single display, single control-fader and a grillion parameters to edit.

1990: A Virtual Birth

In 1995, a small Swedish company took a couple of the world's fastest DSP chips, a handful of cheap knob-workalikes taken from computer mice, and some very clever software that could mathematically replicate analogue circuitry, put it all into a box, added a keyboard, and painted it red. Clavia's Nord Lead was the first "virtual analogue" synth, with the mission of recreating not only the warmth and character of the old analogue synths of yesteryear, but the visceral hands-on joy of real-time knob tweakery. The Nord became the most sought-after synth for both reasons, and prompted other companies to release similarly knob-infested synths such as the Alesis A6 Andromeda, Viscount's Oberheim OB12 and the Waldorf Q. A German newcomer, Access, had a huge hit with their knobbly Virus, and the Japanese soon hopped on board with Roland's JP-8000, Yamaha's AN1x and Korg's MS2000 all vying for the gigging synthesist's attention (and money).

By 2002, it was hard to buy a synth which was not covered in knobs – interactive sound creation was once again in the hands of the musician, and the patch-sellers and sample-sellers started to feel the pinch … for a while.

2000: Komputerwelt

It was around this time that two things happened which revolutionised the music business – a massive jump in the processing power of affordable personal computers (Apple's PowerMac G4 was famously the first personal computer classified as a supercomputer under US export rules!) and the massive global adoption of the internet.

Whilst it's true that the use of computers in music-making has a long history – the Australian computer "CSIRAC" was the first to play music directly, in 1951 – computer-generated audio synthesis had long been a specialty area, and there were no real standards. If you weren't using a dedicated synth-chip on a sound-card, you needed some impressive CPU grunt to math out the waveforms in something resembling realtime. Early soft-synth programs such as ReBirth, Koblo and Metasynth could easily max out a state-of-the-art PC or Mac if you tried to do a lot at once, and you could forget about running more than one synth at a time, too.

Enter Digidesign and Steinberg, two companies that had been in the audio industry since 1984, with almost identical competing products: Pro Tools and Cubase. Both started off as MIDI-only sequencers at almost the same time, both added realtime audio multitracking at around the same time, and both came up with the concept of a modular plug-in architecture at around the same time. Pro Tools was the professional studio's choice more often than not, because it relied heavily on Digidesign's own high-end hardware. Cubase, on the other hand, could make use of audio hardware from many different makers thanks to the ASIO standard Steinberg developed, and was more often found in semi-pro and hobbyist studios.

Digidesign's concept for plug-ins was quite clever – they were written in such a way that they could run either directly on the host computer's CPU or be loaded into a PCI card full of DSP chips. This was great for pro studios, who could afford to add four or more 'DSP farms' to a PC or Mac and run a dozen convolution reverbs, twenty mastering limiters and a few synths & other effects on a 100 MHz PowerPC or Pentium. Not so great if you'd spent all your money on the fastest computer in the shop and were still limited to three synths and three effects plug-ins before the audio output started to stutter.

Steinberg decided to make their plug-ins CPU-dependent, but opened the spec, knowing that computing power was on the rise and eventually there would be cheap computers powerful enough to run dozens of the most complex bits of software ever devised without breathing hard.

2005: The Second Rise & Fall Of Originality

Steinberg's double whammy of the ASIO and VST standards soon meant that to recreate a full electronic music studio, all you needed was a decent computer and a decent sound card. However, because everything was happening on a computer screen, everything was controlled by proxy. For some users, having to edit each control one at a time using a computer mouse was about as enjoyable as creating a new DX-7 patch from scratch. Having to shift constantly from right-brain creative mode to left-brain logical operate-and-control mode was, they said, the fastest way to kill the creative process and make any recording session one long act of frustration.

Music-making is, after all, an intimately interactive experience, and anything which adds time between the idea of a note forming in your brain, your fingers triggering it, and that note reaching your ears is a bad thing. In fact, no matter what the task at hand, the time that encompasses thought-action-reaction needs to be kept as short as possible, preferably under 200 milliseconds. Interactivity is key, and the interface between you, your thoughts, and the desired outcome of the device you are manipulating must be as intuitive as possible. Apple know this, and it's why the iPhone became the most sought-after and most-emulated smartphone less than six months after launch.

This is also why having lots of physical controls directly under your fingertips is so vital to a smooth flow of creative juices, and why we have a healthy industry making control-surfaces that talk MIDI and/or USB and carry all manner and number of knobs, faders, buttons, pads, sticks and other controls that can be mapped to a virtual synth or effect's parameters and thus be recordable. Many computer musicians will be running a dozen or more virtual instruments and effects in a single session, and it is frustrating to have to remember to change a control-surface's internal program to match the front-most plug-in. To reduce some of that frustration, smart controllers like Mackie's Universal Control system or Novation's ReMote-SL range can help a lot by watching the screen for you and re-mapping themselves to whichever VST has focus.
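Under the hood there's no magic to such a mapping – it's just scaling a 7-bit MIDI controller value onto a parameter's range, with one table per plug-in. A toy Python sketch (the CC numbers, parameter names and ranges here are my own illustrative choices, not any particular controller's or plug-in's):

# Map 7-bit MIDI CC values (0-127) onto plug-in parameter ranges.
# A "smart" controller effectively swaps this table whenever the
# front-most plug-in changes.

cc_map = {
    74: ("filter_cutoff", 20.0, 20000.0),   # CC 74 -> cutoff in Hz
    71: ("resonance", 0.0, 1.0),            # CC 71 -> resonance, 0..1
}

def handle_cc(cc_number, cc_value):
    """Translate an incoming CC message into (parameter, value) or None."""
    if cc_number not in cc_map:
        return None
    name, lo, hi = cc_map[cc_number]
    return name, lo + (cc_value / 127.0) * (hi - lo)

print(handle_cc(74, 64))   # cutoff lands roughly half-way up its range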

For some, even this is not enough. Still too slow. Still not enough feedback.

Alas, this is extra hardware which requires extra money, and a lot of budding bedroom musicians have little or none to spend on such things – in fact, a large proportion of computer musicians haven't even paid for the software they use, thanks to the pervasive internet and the ease with which illegitimate copies of software can be downloaded and used without worry. Because of the frustration factor I describe above, most users never bother to get minutely involved with controlling their plug-ins, and once again rely solely on presets.

2010: Plug In, Browse On, Vague Out

So, to the present-day answer to the original question – has the soft-synth killed synthesis? No. Not directly, anyway. So what did kill the art of synthesis? Partly tight-arsed penny-pinching (including outright theft), partly intrinsic human laziness, but mostly the reduction in interactivity – once again, each aspect of a sound has been removed from direct, instant-feedback interaction.

Once again we are seeing preset collections offered for sale and/or download. We are also seeing a massive increase in the number and size of sample libraries. Unfortunately, neither of these helps creativity; in fact they hinder it to a very large extent – we become buried in a morass of patches, programs, plug-ins and samples to choose from, and flicking through the hundreds and thousands of sounds for that one special thing makes us forget what we're looking for. Jean Michel Jarre sums it up rather eloquently:

"Digital technology took electronic music down a blind alley. Musicians were compelled to work in an increasingly cerebral and abstract fashion on samples of existing sound rather than creating original sounds as they had with the first wave of synthesisers.

"The excitement of being able to work on sounds in a tactile, manual, almost sensual way is what drew me to electronic music in the first place. The lack of limitations is very dangerous. It is like the difference for a painter of getting four tubes with four main colours or being in front of a computer with two million colours. You have to scan the two million colours and when you arrive to the last one you have obviously forgotten the first one. In the Eighties we became archivists and everything became rather cold as a result."


Noted composer and synthesist-non-pareil Evangelos Papathanassiou – better known as Vangelis – has almost identical thoughts:

"The way music technology has been designed today, it takes you away from the spontaneity and from the human touch, because you have to follow the machines, and program. I'm not doing that. Everything I do is not pre-programmed, it is done on the spot. One take. Everything is done live, never programmed.

"Comparing the technology I have today with what I had when I did Chariots Of Fire, there is nothing. I have a wider choice of sound, and the sound quality is better. But the system I use is exactly the same, I never change the approach, it is always live. All I've changed is the source of sound, that's all.

"The playability of modern synthesisers is a big problem, you have to use computers, you have to program, you have to choose, and these kinds of things ... I can't do it that way, so I use my own system to access the sounds, to bypass this difficulty, so that it is instant, immediate.

"As you create sound textures and qualities, so you create the composition."


Less Is More

So what's the solution?

First thing to tick off the checklist: get a good controller with plenty of knobs and/or faders, plus a MIDI keyboard. If they are both in the same box, even better – there's plenty to choose from, but you can't go wrong with the Novation ReMote 25 SL. Until you do, you're not making music. If you're determined to stick with a computer keyboard and a mouse and nothing else, you're not a musician. You're a bit-shuffler, a tinkerer; you don't even rate as an audio sculptor. Chimpanzees can make better rhythms slapping the ground bare-handed than you could by clicking beat-grids in FL, and a whale's fart has a better melodic hook than your quantised mouse-drags on a Cubase piano roll.

Second: stop being an archivist. Stop wasting time downloading every VST, patch-set and sample library you see a link to. He who dies with the biggest collection of plugs & samples is the biggest loser of them all. Cure yourself of versionitis, that compulsive need to make sure you have the uber-latest version of everything – the world isn't going to end just because you're on version 1.02 when everyone else is using version 4. Newer is not automatically better.

Third: Reduce your palette. One DAW package (can't go wrong with Live, but Tracktion and REAPER also have good interfaces). No more than six soft-synths, one sampler. One delay, one or two reverbs, one compressor or comp/limiter, a gater if you're into dance music. Up to four "weird" sound-manglers. Once you've settled on your plug-ins (any more than ten in total defeats the purpose) delete the rest. That's right, delete them, get rid of them – at the very least dump them to DVD or a spare hard drive and leave the backups with a friend or family member.

Finally, learn your VSTs backwards. Experiment with every control, every value, learn where the sonic texture boundaries lie. Set up your control-surface so that controls for similar functions across VSTs are in the same location on the control-surface. The aim is to be able to reach out and tweak a control to move closer to the sound in your head without having to think about it – it needs to be intuitive, something you can do almost by instinct. Once you think you're getting a good grip on what a VST can or can't do, delete all the presets it came with. Once you have mastered your less-than-a-dozen instruments and effects, then and only then can you think about adding another synth or effect to the collection.
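
If it helps to see that "same function, same knob" rule written down: think of each physical control as a little lookup table keyed by whichever instrument has focus. A toy sketch in Python – the synth names and parameter identifiers below are pure invention:

    # Knob CC 74 always means "brightness", no matter which synth has focus.
    # Synth names and parameter identifiers are hypothetical.
    KNOB_74 = {
        "BassSynth":   "cutoff",
        "PadSynth":    "vcf_frequency",
        "DrumSampler": "tone",
    }

    def tweak(focused_synth: str, value: int) -> None:
        """Route one physical knob to the equivalent parameter of the focused synth."""
        param = KNOB_74.get(focused_synth)
        if param is not None:
            # A real rig would send this to the plug-in via MIDI or the DAW's API.
            print(f"{focused_synth}.{param} = {value}")

    tweak("PadSynth", 96)  # same gesture, same musical meaning, whichever synth is up

The point being: the mapping lives in one place, and your hands only ever have to learn one layout.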

Claude Monet painted 'Still-Life with Anemones' with eight individual pigments. Leonardo used fourteen on 'Mona Lisa'. Like a good painter, you need to devote time to mixing your palette of available colours and textures together, learning what works and what doesn't. You'll find that with less, you can do more than you thought possible. A lot more.

iPad, You Pad, We All Pad - Let Your Fingers Do The Tweaking

Over the decades, academic musicians such as David Vorhaus, Tod Machover and Don Buchla have focused on different ways of interacting with and controlling sound in performance, and occasionally something clicked and actually made it to market – things like the ribbon controller, Yamaha and Akai's breath controllers, and the Chapman Stick. Toshio Iwai created the odd multibutton "Tenori-On" in 2005, which Yamaha brought to market in 2007; around the same time came the Monome, a low-cost, easily modifiable button-grid controller that could be put to all sorts of uses.

The last few years have seen rapid development in the use of touch-screen technology; the two most notable examples to have reached the market are Haken Audio's "Continuum" fingerboard and Stantum's "Lemur" multi-touch interactive display screen. A dedicated computer in its own right, the Lemur can be programmed to display any graphic imaginable and map regions of its touch-screen surface to send control messages (MIDI, DMX and some others) out to another device such as a hardware (or software) synth, a lighting controller or even an industrial robot. French duo Daft Punk are enthusiastic fans of the Lemur, using up to six of them in their live performances.
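
The underlying trick is easy to sketch: take a touch position within a screen region, scale it to a controller's 0–127 range, and send it out as MIDI. Again in Python with mido, and again the CC number and choice of output port are assumptions made purely for the example:

    import mido

    def touch_to_cc(x: float, cc: int = 1, channel: int = 0) -> mido.Message:
        """Turn a normalised touch position (0.0 - 1.0) into a MIDI Control Change."""
        value = max(0, min(127, round(x * 127)))
        return mido.Message("control_change", channel=channel, control=cc, value=value)

    with mido.open_output() as port:   # first available MIDI output port
        port.send(touch_to_cc(0.42))   # a fingertip 42% of the way along a fader

Everything else the Lemur does – the graphics, the physics, the multi-finger tracking – is elaboration on that one translation.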

The Lemur is an excellent example of instant visual feedback – you see where and how your fingertips are interacting with a control, and your ears and brain register changes almost instantly. As good as real physical controls you can pinch between thumb and finger to twist or flick … almost. Until someone develops technology to make screens extrude graspable bumps under computer control, the Lemur is about as good as it gets with display-centric control surfaces. Despite being available since 2005, limited production kept the price of a Lemur high ($2,000 or so), which is one reason why Stantum decided to close the Lemur division completely in late 2010.

Apple's iPhone and iPod Touch were the first to send shivers up Stantum's spine, and despite their tiny screen size compared to the Lemur, keen programmers were soon coding and releasing apps with a decided musical leaning, including a couple which turned the iPhone into an interactive controller for soft-synths in Logic. Once Apple's $600 iPad came out, the pace really picked up: several big names (MOTU & Ableton, for example) released control-surface apps, and several more from independents also became available. The iPad became the "poor muso's Lemur" and Stantum rapidly lost sales.

Don't like Apple? Don't worry. iPad knockoffs running Windows, Android and linux are appearing on the market – some have multitouch, some don't. It means that programs to turn them into Lemur-alikes will appear on them too, sooner rather than later. The Lemur was the best thing to happen to electronic music in the last decade, but it took Apple to make the hardware affordable and more accessible to programmers, and once Apple had shown the market was ripe for something every geek said would fail miserably, everyone else decided to hop on the bandwagon. Send in the clones …

2011-06-14

Curséd Are The Geek, For They Shall Infect The Earth

As you're no doubt aware -- you'd have to be living on Pluto to miss it -- Apple have been making leaps and bounds in many fields these past few years, not least in how happy they're making the bankerscum. With US$72.8 billion in cash reserves, Apple's market capitalisation surpassed that of Microsoft and Intel combined last week ... yet they don't pay dividends.

What exactly is it that makes Apple so successful? What is it Apple do that makes people shell out premium prices for their iDevices? Why can't other IT companies do what Apple do? I mean, it's not like Apple actually invent anything (despite their press releases). What is it about Apple products that makes their customers go back again and again?

The simple answer: the geeks are not in charge at One Infinite Loop. This is Apple's single biggest advantage over everyone else, and it is what allows Apple's profits to climb steadily whilst every other sector of IT continues the net-worth slide it has been on for nearly two years.

What is it that makes the tech-savvy Windows and/or linux user despise Apple products and deride owners of fruit-bearing electronics with "fanboy", "stupid" and other derogatory labels?

Because he resents and fears Apple's approach of taking power out of the hands of the specialists and restoring it to the common man.

Geeks spend a lot of their lives amassing knowledge of how to coerce powerful but badly designed & implemented information technology (foisted onto the general public by other geeks) into doing what it's supposed to, and along comes Steve with the same technology, redesigned not only to perform without needing specialist knowledge but to be essentially self-maintaining.

Geeks recommend Windows and *nix and their dependent hardware to people because the problems inherent in them guarantee those people will keep coming back for help working around those problems. It validates their existence. Geeks who recommend Apple are recommending themselves out of a job.

Geeks are forever coming up with different ways of doing things: extra features, more choice, expanded flexibility, more options, without any real understanding of what people actually want. They become puzzled when their new ideas and methods are ignored by the market, mistakenly assume there was something fundamentally wrong with the ideas themselves, and come up with even more new ideas and options, which meet the same reception. Because geeks share an almost-identical mindset, they pick up each other's new ideas and run with them because they mesh with how they already think, and eventually one variant will catch the attention of a technologically-incompetent power broker and end up in something that actually sells.

What geeks seem incapable of grasping is a fundamental aspect of human psychology -- it's not figuring out what people want that matters, it's understanding what people don't want. They don't want a universe of options, they don't want complexity, and they definitely do not want to learn new skills. They don't want choice, and they don't want change. Anything new that comes along has to fit in with what they already know, otherwise they feel uncomfortable. If they are presented with too many choices, they become confused, and frustrated.

People resent being asked to step out of their comfort zones.

For the last thirty-odd years we've had an ever-growing number of technologically masterful but socially inept people (the geek) generating hundreds and thousands of different ways we can use technology, with a fraction of a percent being good enough to gain acceptance with greater humanity. The once-vibrant IT industry that grew out of the personal computer boom of the late 70's has become an autistic, wheelchair-bound cripple that moves forward in fits & starts because it is suffering from geek. Code-manglers move up the corporate ecosphere to become project managers and CEOs, new geeks flow in to fill their shoes and make new ideas, and the end result is an industry that is socially inept to its very roots and has absolutely no idea what it is people actually want. It churns out minor variant after minor variant of the same thing over & over, and people buy them because geeks keep telling them they need it. Trouble is, 95% of the variants have problems, or are too complex, or are too expensive in terms of time, efficiency and effectiveness, so one gets dumped and another bought, and that gets dumped and another bought.

The end result is that you end up pissing people off with problematic products for so long that they've grown jaded and have gone back to pencil & paper, leaving several hundred thousand geeks scratching their heads wondering where they went wrong.

Along comes Steve Jobs, a strange man with no innate technical know-how but the gift of seeing the bigger picture: finding out exactly what it is that pisses people off, choosing the right technologies & tricks to eliminate those irritations, and being stubborn enough and charismatic enough to see his visions become reality. Steve is an early adherent of concepts first put forward by PARC alumnus Alan Kay, whose philosophies are what guide Apple:
- Teach computers how humans work, never teach humans how computers work.
- Sweat the small stuff so they don't have to.
- For technology to be accepted, it must become invisible in the environment.
- Interface first.

Business wants to ensure you keep buying and buying; it doesn't matter if they make videocards, mobile phones, toothpaste or toilet paper. Apple have looked at what everyone else has been doing, found out what people don't like, and turned the complex into an appliance. Make people's lives easier and they'll happily pay the up-front premium, and keep coming back because they've finally found something that works like they do. Apple products do not make people more stupid; they make them complacent and raise their expectations of technology in general.

Brand loyalty arises from satisfying people's desires, and only Apple seem to have perfected the better mousetrap.

Telling people to read the manual is also something Apple don't believe in. Geeks say "If you want the user to read the manual, write a better manual." The user interface & HCI specialists' rejoinder is, "If your product needs a manual then you're doing it wrong."

Atheism & Science Are Religions Too, Y'Know

I cannot help but giggle inanely when people say they are an atheist and that they "believe in science". Why? Constable Dorfl, please step forward:

'Atheism Is Also A Religious Position,' Dorfl rumbled.
'No it's not!' said Constable Visit. 'Atheism is a denial of a god!'
'Therefore It Is A Religious Position,' said Dorfl. 'Indeed, A True Atheist Thinks Of The Gods Constantly, Albeit In Terms of Denial. Therefore, Atheism Is A Form Of Belief. If The Atheist Truly Did Not Believe, He Or She Would Not Bother To Deny.'
'Did you read those pamphlets I gave you?' said Visit suspiciously.
'Yes. Many Of Them Did Not Make Sense. But I Should Like To Read Some More.'
'Really?' said Visit. His eyes gleamed. 'You really want more pamphlets?'
'Yes. There Is Much In Them That I Would Like To Discuss. If You Know Some Priests, I Would Enjoy Disputation.'
'All right, all right,' said Sergeant Colon. 'So are you going to take the sodding oath or not, Dorfl?'
Dorfl held up a hand the size of a shovel. 'I, Dorfl, Pending The Discovery Of A Deity Whose Existence Withstands Rational Debate, Swear By The Temporary Precepts of A Self-Derived Moral System—'
'You really want more pamphlets?' said Constable Visit.
Sergeant Colon rolled his eyes.
'Yes,' said Dorfl.
...

'Excuse Me,' said Dorfl.
'We're not listening to you! You're not even really alive!' said a priest.
Dorfl nodded. 'This Is Fundamentally True,' he said.
'See? He admits it!'
'I Suggest You Take Me And Smash Me And Grind The Bits Into Fragments And Pound The Fragments Into Powder And Mill Them Again To The Finest Dust There Can Be, And I Believe You Will Not Find A Single Atom of Life—'
'True! Let's do it!'
'However, In Order To Test This Fully, One Of You Must Volunteer To Undergo The Same Process.'
There was silence.
'That's not fair,' said a priest, after a while. 'All anyone has to do is bake up your dust again and you'll be alive ...'
There was more silence.
Ridcully said, 'Is it only me, or are we on tricky theological ground here?'
There was more silence.
Another priest said, 'Is it true you've said you'll believe in any god whose existence can be proved by logical debate?'
'Yes.'
Vimes had a feeling about the immediate future and took a few steps away from Dorfl.
'But the gods plainly do exist,' said a priest.
'It Is Not Evident.'
A bolt of lightning lanced through the clouds and hit Dorfl's helmet. There was a sheet of flame and then a trickling noise. Dorfl's molten armour formed puddles around his white-hot feet.
'I Don't Call That Much Of An Argument,' said Dorfl calmly, from somewhere in the clouds of smoke.
(Thanks for that, Terry)

Science is a religion. Really. You go to a church, some guy up the front reads from a big book and says "this is how the world works". You go to a physics tute, some guy up the front reads from a big book and says "this is how the world works." I don't really see a lot of difference, do you?

Look, science says you're meant to question everything until you get proof that doesn't change. But by their own definition, proof can only be ascertained by direct experience. What are you going to do, disbelieve the fundamental laws of physics until you've replicated every single possible experiment for yourself, deducing Boyle's Law and the Theory of Relativity from your own evidence alone? Nope, you take it on faith that these guys with more letters after their names than in them know what they're on about.

"We have effectively in science a one party system with a deep commitment to a particular faith. Most scientists, and indeed I myself, have been taught to look at animals, plants and people as being entirely purposeless, living organisms as having originated purely by chance. Lacking any meaning, any value, simply there as animate, automatic mechanisms that can be explained in terms of ordinary physics and chemistry. Many people within science have become very wedded to this mechanistic model, and indeed for some people it has become a kind of religion, and therefore they experience any questioning of this model as an attack on their most fundamental acts of faith. Unfortunately, more and more modern science, no matter which doctrine you care to look at, is rapidly showing that this mechanistic model, this reductionist outlook on science, is proving to be more and more untenable, it just doesn't work, and this of course makes a lot of traditional scientific thinkers very unhappy."
-- Rupert Sheldrake (double first and Ph.D. in biochemistry at Cambridge, Frank Knox Fellow in philosophy & history of science at Harvard, Research Fellow of Britain's Royal Society, and Perrott-Warrick Senior Researcher at Trinity College, Cambridge)

"Possibly the most absurb belief amongst my peers is that what we know [as the body of scientific knowledge] is immutable, and that once a theory has been proven by the scientific method the knowledge thus gained is unchangeable. Too much importance is placed on what we do know, and far too much effort spent on discrediting and denying anything which does not readily slot into place within the current body of scientific knowledge, or anything which attempts to change that knowledge with new facts. What is truly important is what we still don't know." -- Sir Martin Rees (Astronomer Royal, Baron Rees of Ludlow, Plumian Professor of Cosmology & Astrophysics, Cambridge)

Accept nothing. Question everything. All truths are subjective, all facts are relative. The only thing in the entire universe which does not change is change. The only thing you, as a human being, can ever accept as a truth or a fact is what enters your brain by direct sensorial input. Anything that happens outside the confines of your own skull is thus, by its very nature, questionable.

Reality only exists because we agree to it. When reality shifts, the agreement has been changed.

Joseph Campbell, mythologist and theologian, once examined the core teachings of all the world's major and many minor religions, and discovered that the similarities were far too frequent to be considered coincidence. Central to virtually every religion and every system of spiritual belief is the Golden Rule:
• Baha'i: "Lay not on any soul a load that you would not wish to be laid upon you, and desire not for anyone the things you would not desire for yourself." -- Baha'u'llah, Gleanings
• Hinduism: "This is the sum of duty: do not do to others what would cause pain if done to you." -- Mahabharata 5:1517
• Buddhism: "Treat not others in ways that you yourself would find hurtful." -- Udana-Varga 5:18
• Taoism: "Regard your neighbour's gain as your own gain, and your neighbour's loss as your loss." -- T'ai Shang Kan Ying P'ien 213-218
• Christianity: "In everything, do to others as you would have them do to you, for this is the law and the prophets." -- Jesus, Gospel of Matthew 7:12
• Unitarianism: "We affirm and promote respect for the interdependent web of all existence of which we are a part." -- Unitarian Principle
• Judaism: "What is hateful to you, do not do to others. This is the whole Torah; all the rest is commentary." -- Hillel, Shabbat 31a
• Islam: "Not one of you truly believes until you wish for others what you wish for yourself." -- The Prophet Muhammad, Hadith
• Wicca / Paganism: "An it harm none, do what ye will." -- the Wiccan Rede
Y'know what's sad? The two fastest-growing religious cults -- atheism and science -- don't have a golden rule.