2011-12-24

Has The Soft-synth Killed Synthesis?

1960: In The Beginning, Bob Created The Filter

If you don't know who Bob Moog is, you don't deserve to own a synth (virtual or otherwise) and should leave the internet, find a Strat and a monkey with a drumkit and stick to doing covers of Elvis or The Jams – I hear there's good money to be made on the cabaret circuit :)

Even though Bob had some musical background, his first circuits were never intended to be used as musical instruments, but when a friend heard the first tones and warbles and commented that they sounded like something from the movie 'Forbidden Planet', Bob changed his mind, figured out how to add an old organ keyboard, and the rest is history. Musical academics soon queued up for one, then the musique concrète crowd, then prog rockers started calling. Walter Carlos' synthetic orchestrations for 'A Clockwork Orange' made the Moog a household name, and Keith Emerson showed what it could do for rock, which led Bob to take a few key circuits from his wall-filling 900 Series boxes and create the Minimoog.

1970: Synthetic Cows And The Polybeast - Thanks For The Memories

Seeing the Minimoog being used as a successful live instrument enticed others to build their own, such as the ARP Odyssey, the EMS Synthi and Tom Oberheim's SEM – a synth-in-a-box meant to be plugged into a Minimoog but which ended up with a keyboard of its own. The synth escaped the realm of prog-rock and was soon found in classical performance, jazz and pop music, as well as spawning its own horrendous sub-genre of novelty albums full of cover-tunes played on synths, inevitably using "Moog" in the title even when the synths were from other makers. Thanks to pulp-musicians like Hugo Montenegro, we now know that plastic cows say "moog".

I know. Sad, isn't it.

Despite this recognition, synths were like a saxophone in that they could only make one note or sound at a time, and there was no way to save a particular arrangement of knobs & switches except by human memory alone. Some people used patch-sheets to draw knob positions, some took Polaroid photos of the front panel, and Rick Wakeman would glue the knobs of his Minimoog into place once he got a sound he liked and pull another Minimoog out of his collection. Moog tried to solve the first problem with their Polymoog, but it was closer to an organ than a synth, which is why Yamaha's CS50, released a year later in 1976, is considered the first true polyphonic synth. Yamaha's attempt to solve the second problem of remembering sounds was odd – on the CS50's big brother, the massive GX1, there were four boxes on the top, each holding what was essentially a miniaturised but complete copy of the front panel.

The big breakthrough that solved both problems came in 1978 with Sequential Circuits' Prophet 5 – digitally controlled analogue oscillators and filters that could handle a five-finger chord coupled with a small CPU and battery-backed RAM that could save every control's setting in ten (and shortly after, one hundred) patch memories.

1980: The First Rise & Fall Of Originality

As the 80's progressed, the focus on electronic synthesis shifted away from rich prog-rockers and academics to English punk bands, which led to wide acceptance by pop & rock bands once tracks like "Fade To Grey" by Visage and New Order's "Blue Monday" became top-ten hits. Aspiring bands the world over wanted synth-players as well as drummers and bassists and guitarists, and music shops soon had American and Japanese Prophets and Odysseys and Jupiters sitting next to the Fender guitars and the Zildjian cymbals. One corner of the shop would have staff yelling at guitarists to stop playing "Stairway To Heaven", the other had staff yelling at keyboardists to stop playing "Jump" and "Oxygene".

As more and more broadcast music had synthetic content in some form, synth-nerds would often play synth-spotting – instruments from particular makers often had a recognisable character, which was pretty much the only clue we had, because every synth-player made their own sounds. Many synths of the day had no patch memory, or at best very little, and those with patch memory often had all of the factory sounds overwritten. Watching music videos, many of the synth-players had one hand on the keyboard and the other flying around the controls, twirling knobs and flicking switches as they played. Brian Eno in his Roxy Music days was a good example, as was Manfred Mann in the clip for "Blinded By The Light". At the other end of the spectrum, successful synthesists would be seen with the $30,000 Fairlight (Herbie Hancock's "Rockit") or the $100,000 Synclavier (Tony Clark in Dire Straits' "Money For Nothing").

Then the music died.

On November 22nd 1983, the market's first affordable all-digital synthesiser was launched – the Yamaha DX-7.

Unlike every other synth before it, the DX-7 did away with analogue subtractive synthesis in favour of FM (frequency modulation) synthesis, which made for incredibly accurate bell, chime, brass, percussion and piano sounds. It also had 32 slots of patch memory and, in another first, a cartridge slot where more patches could be stored. Compared to the twenty or so knobs and faders analogue synths usually had, the DX-7 had over 180 separate parameters that combined to make up a single patch or sound ... but they were all buried behind a 2x16 character display and one – count 'em, one – fader. The range of sounds the DX could produce was huge, and it could mimic real instruments far better than anything analogue ever could. It also had, compared to its competition, a ridiculously low price. The DX-7 sold like hotcakes. The airwaves soon filled with bands that sounded a lot like each other, and in almost every one of them the DX-7 'E.Piano2' or 'Marimba' or 'Bell1' factory patches could be heard.
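If you've never poked at FM, the core trick is surprisingly small – one sine wave wobbling the phase of another. Here's a minimal two-operator sketch in Python with numpy, purely illustrative and nowhere near Yamaha's actual six-operator implementation; the ratio and index values below are my own guesses for a bell-ish tone, not DX-7 patch data:

    import numpy as np

    def fm_tone(freq=440.0, ratio=3.5, index=2.0, dur=1.0, sr=44100):
        # Carrier at freq, modulator at freq * ratio; index sets modulation depth.
        # Values are illustrative only, not taken from any factory patch.
        t = np.arange(int(sr * dur)) / sr
        modulator = np.sin(2 * np.pi * freq * ratio * t)
        carrier = np.sin(2 * np.pi * freq * t + index * modulator)
        return carrier * np.exp(-3.0 * t)   # simple exponential decay envelope

    tone = fm_tone()   # one second of a decaying, bell-like FM tone

Non-integer ratios like the 3.5 above give the clangy bells and chimes; integer ratios give brassier, more harmonic tones.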

The other two big Japanese synth-makers, Korg and Roland, quickly realised that by adopting digital techniques the cost of making the actual hardware of a synth plummeted dramatically. Roland introduced Linear Arithmetic synthesis with over four hundred parameters per patch, Korg went for a mix of tiny sampled waveforms from real instruments and processor control of analogue filter-chips (with up to 200 parameters per patch, depending on model), whilst newcomer Kawai brought cheap additive synthesis to the market, once the domain of supersynths like the Synclavier. These synths all shared the same overall characteristics: a bland front panel, an LCD display, never more than four physical knobs or faders, a depressingly small keypad, a card or cartridge slot, and MIDI jacks. Thanks to MIDI, they could even dispose of the keyboard, and the rack-mount synth module was born. The big Americans – Sequential, Moog and Oberheim – stuck mostly with their analogue synths and tried to increase flexibility whilst keeping as many knobs as possible, but the cost of all those physical controls meant they couldn't compete with the Japanese on price. Even after adopting the same approach, machines like the Oberheim Matrix-1000, the Prophet VS and the Moog Source could not save the companies.

This made creating new sounds a major chore for most musicians – gone was the ability to reach out and tweak a knob to get an instant change in the sound. Instead you had to know your synthesis and know your synth: press a few buttons to reach the parameter you wanted, press a few more buttons or move the data-entry knob to the new value, and then play the keyboard. None of the early digital synths let you change the sound in real time – you couldn't hold a key down, change a parameter's value and hear the difference in the sound without letting go and pressing the key again.

A new market sprang up almost overnight – pre-programmed cartridges full of patches made by other people. Personal computers of the day – Apple IIs, Commodore 64s and Ataris – grew MIDI interfaces and patch-librarian programs. Soon you could buy a thousand new sounds on a single floppy disk. Hardly anyone bought a synth to make their own unique sounds any more, unless they were one of the very few anal-retentives who made sounds to sell.

Sampling – recording a sound and then altering its playback speed to change the pitch – had been around as long as the personal computer had, but to get it in a stand-alone musical instrument with good audio quality you had to spend big bucks on a Fairlight or Synclavier.
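In code terms the sampler trick is almost embarrassingly simple – a rough sketch, assuming the recording is already sitting in a numpy array, and ignoring anti-aliasing, looping and all the other things a real sampler has to worry about:

    import numpy as np

    def repitch(samples, semitones):
        # Play the recording back faster or slower: +12 semitones doubles the
        # playback speed (and the pitch) and halves the duration.
        speed = 2.0 ** (semitones / 12.0)
        positions = np.arange(0.0, len(samples) - 1, speed)
        idx = positions.astype(int)
        frac = positions - idx
        # Simple linear interpolation between neighbouring samples.
        return samples[idx] * (1.0 - frac) + samples[idx + 1] * frac

Play it back faster and the pitch goes up but the sound gets shorter and more chipmunk-like, which is exactly how those early machines behaved.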

An offshoot of cheap digitally controlled synthesis was the introduction of affordable hardware samplers, such as E-mu Systems' Emulator, the Ensoniq Mirage, and the Akai S1000. The patch-library companies were soon adding disks full of samples to their catalogues, but it was Akai's decision to add a CD-ROM drive peripheral that made it the studio choice and cemented the Akai sample format as an industry standard. The Aussies continued to battle on, but the sample-driven Fairlight could not compete against machines one twentieth of the price, and the Synclavier hung on for a few more years as a high-end machine thanks to a few huge hard-drive-only sampler additions and the Denny Jaeger string library – and only because it was cheaper to hire a Synclavier than it was to get the London Symphony Orchestra in to do your film score.

If you couldn't compete on price, you soon died. Fairlight got out of the music game in 1989, and New England Digital folded completely in 1993. Sequential Circuits was bought up by Yamaha, Oberheim got bought by Gibson and stopped making synths, Voyetra and many other smaller makers died completely, Kawai faded into obscurity, and Bob Moog hung on by the skin of his teeth making expensive theremins. By 1992, eighty percent of all synths sold bore a badge reading Roland, Yamaha or Korg, and every device had that telltale single display, single control-fader and a grillion parameters to edit.

1990: A Virtual Birth

In 1995, a small Swedish company took a couple of the world's fastest DSP chips, a handful of cheap knob-workalikes taken from computer mice, and some very clever software that could mathematically replicate analogue circuitry, put it all into a box, added a keyboard, and painted it red. Clavia's Nord Lead was the first "virtual analogue" synth, with the mission of recreating not only the warmth and character of the old analogue synths of yesteryear, but also the visceral hands-on joy of real-time knob tweakery. The Nord became the most sought-after synth for both reasons, and prompted other companies to release similarly knob-infested synths such as the Alesis A6 Andromeda, Viscount's Oberheim OB12 and the Waldorf Q. A German newcomer, Access, had a huge hit with their knobbly Virus, and the Japanese makers soon hopped on board, with Roland's JP-8000, Yamaha's AN1x and Korg's MS2000 all vying for the gigging synthesist's attention (and money).

By 2002, it was hard to buy a synth which was not covered in knobs – interactive sound creation was once again in the hands of the musician, and the patch-sellers and sample-sellers started to feel the pinch … for a while.

2000: Komputerwelt

It was around this time that two things happened which revolutionised the music business – a massive jump in the processing power of affordable personal computers (Apple's Power Mac G4 was marketed as the first personal computer to earn a supercomputer classification from the US government!) and massive global adoption and use of the internet.

Whilst it's true that the use of computers in music-making has a long history – the Australian computer "CSIRAC" was the first to play music directly, in 1951 – the realm of computer-generated audio synthesis had been a specialty area for some time, and there were no real standards. If you weren't using a dedicated synth-chip on a sound card, then you needed some impressive CPU grunt to math out the waveforms in something resembling real time. Early soft-synth programs such as ReBirth, Koblo and MetaSynth could easily max out a state-of-the-art PC or Mac if you tried to do a lot at once, and you could forget about running more than one synth at a time.

Enter Digidesign and Steinberg, two companies that had been in the audio industry since 1984, with almost identical competing products: Pro Tools and Cubase. Both started off as MIDI-only sequencers at almost the same time, both added realtime audio multitracking at around the same time, and both came up with the concept of a modular plug-in architecture at around the same time. Pro Tools was the professional studio's choice more often than not, because it relied heavily on Digidesign's own high-end hardware. Cubase, on the other hand, could make use of audio hardware from many different makers thanks to the ASIO standard Steinberg developed, and was more often found in semi-pro and hobbyist studios.

Digidesign's concept for plug-ins was quite clever – they were written in such a way that they could be run either directly on the host computer's CPU or loaded into a PCI card full of DSP chips. This was great for pro studios that could afford to add four or more 'DSP farms' to a PC or Mac and run a dozen convolution reverbs, twenty mastering limiters and a few synths & other effects on a 100 MHz PowerPC or Pentium. Not so great if you'd spent all your money on the fastest computer in the shop and were still limited to three synths and three effects plug-ins before the audio output started to stutter.

Steinberg decided to make their plug-ins run entirely on the host CPU, but opened the spec, knowing that computing power was on the rise and that eventually there would be cheap computers powerful enough to run dozens of the most complex bits of software ever devised without breathing hard.
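Conceptually, the contract Steinberg opened up is tiny – the host repeatedly hands the plug-in a block of samples and a set of parameter values, and the plug-in hands back processed samples, all in native code on the CPU. A rough Python sketch of the shape of it (this is not the actual VST C++ API, just an illustration):

    # Not the real VST interface - just the shape of the host/plug-in contract.
    class GainPlugin:
        def __init__(self):
            self.params = {"gain": 1.0}          # parameters the host can automate

        def set_param(self, name, value):
            self.params[name] = value

        def process(self, block):
            # Called by the host once per audio block (e.g. 256 samples),
            # entirely on the host CPU - no DSP card required.
            gain = self.params["gain"]
            return [sample * gain for sample in block]

    # A host would do something like:
    plugin = GainPlugin()
    plugin.set_param("gain", 0.5)
    output = plugin.process([0.1, -0.2, 0.3, -0.4])

Multiply that by a synth's worth of oscillators, filters and envelopes, called hundreds of times a second, and you can see why the early native hosts ran out of CPU so quickly.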

2005: The Second Rise & Fall Of Originality

Steinberg's double whammy of the ASIO and VST standards soon meant that to recreate a full electronic music studio, all you needed was a decent computer and a decent sound card. However, because everything was happening on a computer screen, everything was controlled by proxy. For some users, having to edit each control one at a time using a computer mouse was about as enjoyable as creating a new DX-7 patch from scratch. Having to constantly shift from right-brain creative mode to left-brain logical operate-and-control mode was, they said, the fastest way to kill the creation process and make any recording session one long act of frustration.

Music-making is, after all, an intimately interactive experience, and anything which adds time between the idea of a note forming in your brain, your fingers triggering it, and that note reaching your ears is a bad thing. In fact, no matter what the task at hand, the time that encompasses thought-action-reaction needs to be kept as short as possible, preferably under 200 milliseconds. Interactivity is key, and the interface between you, your thoughts, and the desired outcome from the device you are manipulating must be as intuitive as possible. Apple know this, and it's why the iPhone became the most sought-after and most emulated smartphone less than six months after launch.

This is also why having lots of physical controls directly under your fingertips is so vital to a smooth flow of creative juices, and why we have a healthy industry making control surfaces that talk MIDI and/or USB and have all manner and number of knobs, faders, buttons, pads, sticks and other controls that can be mapped to a virtual synth or effect's controls and thus be recordable. Many computer musicians will be running a dozen or more virtual instruments and effects in a single session, and it is frustrating to have to remember to change a control surface's internal program to match the front-most plug-in. To reduce some of that frustration, smart controllers like Mackie's Control Universal or Novation's ReMote SL range can help a lot by watching the screen for you and switching their settings to match whatever VST is currently in focus.
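Under the hood, that kind of smart controller is really just keeping a lookup table per plug-in and swapping tables when the focus changes. A toy sketch of the idea – the plug-in names, CC numbers and parameter names below are all made up for illustration, and the real products do a great deal more:

    # Hypothetical plug-in names, CC numbers and parameter IDs, for illustration only.
    CC_MAP = {
        "SoftSynthA": {21: "filter_cutoff", 22: "filter_resonance", 23: "env_attack"},
        "DelayB":     {21: "feedback",      22: "wet_dry",          23: "delay_time"},
    }

    def handle_cc(focused_plugin, cc_number, cc_value, set_param):
        # Route an incoming MIDI CC (value 0-127) to whichever plug-in has focus,
        # scaled to the 0.0-1.0 range most plug-in parameters expect.
        params = CC_MAP.get(focused_plugin, {})
        name = params.get(cc_number)
        if name is not None:
            set_param(focused_plugin, name, cc_value / 127.0)

The pay-off is that knob 21 always means "the big important control", no matter which plug-in happens to be in front of you – which is exactly the point made in the 'Less Is More' section below.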

For some, even this is not enough. Still too slow. Still not enough feedback.

Alas, this is extra hardware which requires extra money, and a lot of budding bedroom musicians have little or no money to spend on such things – in fact, a large proportion of computer musicians didn't even pay for the software they use, thanks to the pervasive internet and the ease with which illegitimate copies of software can be downloaded and used without worry. Because of the frustration factor I describe above, most users never bother to get minutely involved with controlling their plug-ins and once again rely solely on presets.

2010: Plug In, Browse On, Vague Out

So, the present-day answer to the original question – has the soft-synth killed synthesis? – is no. Not directly, anyway. So what did kill the art of synthesis? Partly tight-arsed penny-pinching (including outright theft), partly intrinsic human laziness, but mostly the reduction in interactivity – once again each aspect of a sound is removed from direct, instant-feedback interaction.

Once again we are seeing preset collections being offered for sale and/or download. We are also seeing a massive increase in the number and size of sample libraries. Unfortunately, neither of these helps creativity – in fact they hinder it to a very large extent. We become buried in a morass of patches, programs, plug-ins and samples to choose from, and flicking through the hundreds and thousands of sounds for that one special thing makes us forget what we're looking for. Jean Michel Jarre sums it up rather eloquently:

"Digital technology took electronic music down a blind alley. Musicians were compelled to work in an increasingly cerebral and abstract fashion on samples of existing sound rather than creating original sounds as they had with the first wave of synthesisers.

"The excitement of being able to work on sounds in a tactile, manual, almost sensual way is what drew me to electronic music in the first place. The lack of limitations is very dangerous. It is like the difference for a painter of getting four tubes with four main colours or being in front of a computer with two million colours. You have to scan the two million colours and when you arrive to the last one you have obviously forgotten the first one. In the Eighties we became archivists and everything became rather cold as a result."


Noted composer and synthesist nonpareil Evangelos Papathanassiou – better known as Vangelis – has almost identical thoughts:

"The way music technology has been designed today, it takes you away from the spontaneity and from the human touch, because you have to follow the machines, and program. I'm not doing that. Everything I do is not pre-programmed, it is done on the spot. One take. Everything is done live, never programmed.

"Comparing the technology I have today with what I had when I did Chariots Of Fire, there is nothing. I have a wider choice of sound, and the sound quality is better. But the system I use is exactly the same, I never change the approach, it is always live. All I've changed is the source of sound, that's all.

"The playability of modern synthesisers is a big problem, you have to use computers, you have to program, you have to choose, and these kinds of things ... I can't do it that way, so I use my own system to access the sounds, to bypass this difficulty, so that it is instant, immediate.

"As you create sound textures and qualities, so you create the composition."


Less Is More

So what's the solution?

First thing to tick off the checklist: get a good controller with plenty of knobs and/or faders, and a MIDI keyboard. If they are both in the same box, even better – there's plenty to choose from, but you can't go wrong with the Novation ReMote SL25. Until you do, you're not making music. If you're determined to stick with a computer keyboard and a mouse and nothing else, you're not a musician. You're a bit-shuffler, a tinkerer; you don't even rate as an audio sculptor. Chimpanzees can make better rhythms slapping the ground bare-handed than you could by clicking beat-grids in FL, and a whale's fart has a better melodic hook than your quantised mouse-drags on a Cubase piano roll.

Second: stop being an archivist. Stop wasting time downloading every VST, patch-set and sample library you see a link to. He who dies with the biggest collection of plugs & samples is the biggest loser of them all. Cure yourself of versionitis, that compulsive affliction of making sure you have the uber-latest version of everything – the world isn't going to end just because you're on version 1.02 when everyone else is using version 4. Newer is not automatically better.

Third: reduce your palette. One DAW package (you can't go wrong with Live, but Tracktion and REAPER also have good interfaces). No more than six soft-synths, one sampler. One delay, one or two reverbs, one compressor or comp/limiter, a gater if you're into dance music. Up to four "weird" sound-manglers. Once you've settled on your plug-ins (any more than ten in total defeats the purpose), delete the rest. That's right, delete them, get rid of them – or at the very least dump them to DVD or a spare hard drive and leave the backups with a friend or family member.

Finally, learn your VSTs backwards. Experiment with every control, every value, and learn where the sonic texture boundaries lie. Set up your control surface so that controls for similar functions across VSTs sit in the same physical location. The aim is to be able to reach out and tweak a control to move closer to the sound in your head without having to think about it – it needs to be intuitive, something you can do almost by instinct. Once you think you're getting a good grip on what a VST can or can't do, delete all the presets it came with. Once you have mastered your less-than-a-dozen instruments and effects, then and only then can you think about adding another synth or effect to the collection.

Claude Monet painted 'Still-Life with Anemones' with eight individual pigments. Leonardo used fourteen on the 'Mona Lisa'. Like a good painter, you need to devote time to mixing your palette of available colours and textures together, learning what works and what doesn't. You'll find that with less, you can do more than you thought possible. A lot more.

iPad, You Pad, We All Pad - Let Your Fingers Do The Tweaking

Over the decades, academic musicians such as Steve Vorhaus, Tod Machover and Don Buchla have focused on different methods of interacting to control sound in a performance, and occasionally something clicked and actually made it to the market – things like the ribbon controller, Yamaha and Akai's breath controller, and the Chapman Stick. Toshio Iwai created the odd multibutton "Tenori-On" in 2005, which Yamaha brought to market in 2007 and which spawned the Monome, a low cost, easily modifiable button-grid controller that could be put to all sorts of uses.

The last few years have seen rapid development in the use of touch-screen technology; the two most notable examples to have reached the market are Haken Audio's "Continuum" fingerboard and Stantum's "Lemur" multi-touch interactive display. A dedicated computer in its own right, the Lemur can be programmed to display any graphic imaginable and map regions of its touch-screen surface to send control messages (MIDI, DMX and some others) out to another device such as a hardware (or software) synth, a lighting controller or even an industrial robot. French duo Daft Punk are enthusiastic fans of the Lemur, using up to six of them in their live performances.

The Lemur is an excellent example of instant visual feedback – you see where and how your fingertips are interacting with a control, and your ears and brain register the changes almost instantly. As good as real physical controls you can pinch between thumb and finger to twist or flick … almost. Until someone develops technology to make screens extrude graspable bumps under computer control, the Lemur is about as good as it gets with display-centric control surfaces. Despite being available since 2005, limited production output has meant the price of a Lemur stayed high ($2,000 or so), which is one reason why Stantum decided to close the Lemur division completely in late 2010.

Apple's iPhone and iPod Touch were the first to send shivers up Stantum's spine, and despite their tiny screen size compared to the Lemur, keen programmers were soon coding and releasing apps with a decidedly musical leaning, including a couple which turned the iPhone into an interactive controller for soft-synths in Logic. Once Apple's $600 iPad came out, the pace really picked up: several big names (MOTU & Ableton, for example) released control-surface apps, and several more from independents also became available. The iPad became the "poor muso's Lemur", and Stantum rapidly lost sales.

Don't like Apple? Don't worry. iPad knockoffs running Windows, Android and Linux are appearing on the market – some have multitouch, some don't. It means that programs to turn them into Lemur-alikes will appear on them too, sooner rather than later. The Lemur was the best thing to happen to electronic music in the last decade, but it took Apple to make the hardware affordable and more accessible to programmers, and once Apple had shown the market was ripe for something every geek said would fail miserably, everyone else decided to hop on the bandwagon. Send in the clones …

2011-06-14

Curséd Are The Geek, For They Shall Infect The Earth

As you're no doubt aware -- you'd have to be living on Pluto to miss it -- Apple have been making leaps and bounds in many fields these past few years, not least in how happy they're making the bankerscum. With US$72.8 billion in cash reserves, Apple's market value surpassed that of Microsoft and Intel combined last week ... yet they don't pay dividends.

What exactly is it that makes Apple so successful? What is it Apple do that makes people shell out premium prices for their iDevices? Why can't other IT companies do what Apple do? I mean, it's not like Apple actually invent anything (despite their press releases). What is it about Apple products that makes their customers go back again and again?

The simple answer: the geeks are not in charge at One Infinite Loop. This is their single biggest advantage over everyone else, and it is what allows Apple's profits to climb steadily whilst every other sector of IT is in a downhill slide that has been going on for nearly two years.

What is it that makes the tech-savvy Windows and/or Linux user despise Apple products and deride owners of fruit-bearing electronics as "fanboys" and "stupid", among other derogatory remarks?

Because he resents and fears Apple's approach of taking power out of the hands of the specialists and restoring it to the common man.

Geeks spend a lot of their lives amassing knowledge of how to coerce into working the powerful but badly designed & implemented information technology that other geeks have foisted onto the general public – and then along comes Steve with the same technology doing the same things, except it has been redesigned to perform without needing specialist knowledge, and is essentially self-maintaining.

Geeks recommend Windows and *nix and their dependent hardware to people because the problems inherent in them guarantee those people will keep needing the geek to come back and work around those problems. It validates their existence. Geeks who recommend Apple are recommending themselves out of a job.

Geeks are forever coming up with different ways of doing things – extra features, more choice, expanded flexibility, more options – without any real understanding of what people actually want. They become puzzled when their new ideas and new methods are ignored by the market, mistakenly assume there was something fundamentally wrong with the ideas themselves, and come up with even more new ideas and more new options, which meet the same reception. Because geeks all share an almost identical mindset, other geeks become enthused, pick up the new ideas and run with them, since they mesh with how they already think, and eventually one variant will catch the attention of a technologically incompetent power broker and end up in something that actually sells.

What geeks seem incapable of grasping is a fundamental aspect of human psychology -- it's not figuring out what people want that matters, it's understanding what people don't want. They don't want a universe of options, they don't want complexity, and they definitely do not want to learn new skills. They don't want choice, and they don't want change. Anything new that comes along has to fit in with what they already know, otherwise they feel uncomfortable. If they are presented with too many choices, they become confused and frustrated.

People resent being asked to step out of their comfort zones.

For the last thirty-odd years we've had an ever-growing number of technologically masterful but socially inept people (the geek) generating hundreds and thousands of different ways we can use technology, with a fraction of a percent being good enough to gain acceptance with greater humanity. The once-vibrant IT industry that grew out of the personal computer boom of the late 70's now lurches forward in fits & starts because it is suffering from geek. Code manglers move up the corporate ecosphere to become project managers and CEOs, with new geeks flowing in to fill their shoes and make new ideas; the end result is an industry that is socially inept to its very roots and has absolutely no idea what it is people actually want. It churns out minor variant after minor variant of the same thing over & over, and people buy them because geeks keep telling them they need it. Trouble is, 95% of the variants have problems, or are too complex, or are too expensive in terms of time, efficiency and effectiveness, so it gets dumped and another one bought, and that gets dumped and another one bought.

The end result is that you end up pissing people off with problematic products for so long that they've grown jaded and have gone back to pencil & paper, leaving several hundred thousand geeks scratching their heads wondering where they went wrong.

Along comes Steve Jobs, a strange man with no innate technical know-how but the gift of seeing the bigger picture, finding out exactly what it is that pisses people off, choosing the right technologies & tricks to eliminate those annoyances, and being stubborn enough and charismatic enough to see his visions become reality. Steve is an early adherent of concepts first put forward by PARC alumnus Alan Kay, and those philosophies are what guide Apple:
- Teach computers how humans work, never teach humans how computers work.
- Sweat the small stuff so they don't have to.
- For technology to be accepted, it must become invisible in the environment.
- Interface first.

Business wants to ensure you keep buying and buying; it doesn't matter if they make video cards, mobile phones, toothpaste or toilet paper. Apple have looked at what everyone else has been doing, found out what people don't like, and turned the complex into an appliance. Make people's lives easier and they'll happily pay the up-front premium, and keep coming back because they finally found something that works like they do. Apple products do not make people more stupid; they make them complacent and raise their expectations of technology in general.

Brand loyalty arises from satisfying people's desires, and only Apple seem to have perfected the better mousetrap.

Telling people to read the manual is also something Apple don't believe in. Geeks say "If you want the user to read the manual, write a better manual." The user interface & HCI specialists' rejoinder is, "If your product needs a manual then you're doing it wrong."

Atheism & Science Are Religions Too, Y'Know

I cannot help but giggle inanely when people say they are an atheist and that they "believe in science". Why? Constable Dorfl, please step forward:

'Atheism Is Also A Religious Position,' Dorfl rumbled.
'No it's not!' said Constable Visit. 'Atheism is a denial of a god!'
'Therefore It Is A Religious Position,' said Dorfl. 'Indeed, A True Atheist Thinks Of The Gods Constantly, Albeit In Terms of Denial. Therefore, Atheism Is A Form Of Belief. If The Atheist Truly Did Not Believe, He Or She Would Not Bother To Deny.'
'Did you read those pamphlets I gave you?' said Visit suspiciously.
'Yes. Many Of Them Did Not Make Sense. But I Should Like To Read Some More.'
'Really?' said Visit. His eyes gleamed. 'You really want more pamphlets?'
'Yes. There Is Much In Them That I Would Like To Discuss. If You Know Some Priests, I Would Enjoy Disputation.'
'All right, all right,' said Sergeant Colon. 'So are you going to take the sodding oath or not, Dorfl?'
Dorfl held up a hand the size of a shovel. 'I, Dorfl, Pending The Discovery Of A Deity Whose Existence Withstands Rational Debate, Swear By The Temporary Precepts of A Self-Derived Moral System—'
'You really want more pamphlets?' said Constable Visit.
Sergeant Colon rolled his eyes.
'Yes,' said Dorfl.
...

'Excuse Me,' said Dorfl.
'We're not listening to you! You're not even really alive!' said a priest.
Dorfl nodded. 'This Is Fundamentally True,' he said.
'See? He admits it!'
'I Suggest You Take Me And Smash Me And Grind The Bits Into Fragments And Pound The Fragments Into Powder And Mill Them Again To The Finest Dust There Can Be, And I Believe You Will Not Find A Single Atom of Life—'
'True! Let's do it!'
'However, In Order To Test This Fully, One Of You Must Volunteer To Undergo The Same Process.'
There was silence.
'That's not fair,' said a priest, after a while. 'All anyone has to do is bake up your dust again and you'll be alive ...'
There was more silence.
Ridcully said, 'Is it only me, or are we on tricky theological ground here?'
There was more silence.
Another priest said, 'Is it true you've said you'll believe in any god whose existence can be proved by logical debate?'
'Yes.'
Vimes had a feeling about the immediate future and took a few steps away from Dorfl.
'But the gods plainly do exist,' said a priest.
'It Is Not Evident.'
A bolt of lightning lanced through the clouds and hit Dorfl's helmet. There was a sheet of flame and then a trickling noise. Dorfl's molten armour formed puddles around his white-hot feet.
'I Don't Call That Much Of An Argument,' said Dorfl calmly, from somewhere in the clouds of smoke.
(Thanks for that, Terry)

Science is a religion. Really. You go to a church, some guy up the front reads from a big book and says "this is how the world works". You go to a physics tute, some guy up the front reads from a big book and says "this is how the world works." I don't really see a lot of difference, do you?

Look, science says you're meant to question everything until you get proof that doesn't change. But by their own definition, proof can only be ascertained by direct experience. What are you going to do, disbelieve the fundamental laws of physics until you've replicated every single possible experiment for yourself and from that, deduce Boyle's Law and the Theory of Relativity from just your own evidence? Nope, you take it on faith that these guys with more letters after their names than in them know what they're on about.

"We have effectively in science a one party system with a deep commitment to a particular faith. Most scientists, and indeed I myself, have been taught to look at animals, plants and people as being entirely purposeless, living organisms as having originated purely by chance. Lacking any meaning, any value, simply there as animate, automatic mechanisms that can be explained in terms of ordinary physics and chemistry. Many people within science have become very wedded to this mechanistic model, and indeed for some people it has become a kind of religion, and therefore they experience any questioning of this model as an attack on their most fundamental acts of faith. Unfortunately, more and more modern science, no matter which doctrine you care to look at, is rapidly showing that this mechanistic model, this reductionist outlook on science, is proving to be more and more untenable, it just doesn't work, and this of course makes a lot of traditional scientific thinkers very unhappy."
-- Rupert Sheldrake (double first and Ph.D. in biochemistry at Cambridge, Frank Knox Fellow in philosophy & history of science at Harvard, Research Fellow of Britain's Royal Society, and director of the Perrott-Warrick Project at Trinity College, Cambridge)

"Possibly the most absurb belief amongst my peers is that what we know [as the body of scientific knowledge] is immutable, and that once a theory has been proven by the scientific method the knowledge thus gained is unchangeable. Too much importance is placed on what we do know, and far too much effort spent on discrediting and denying anything which does not readily slot into place within the current body of scientific knowledge, or anything which attempts to change that knowledge with new facts. What is truly important is what we still don't know." -- Sir Martin Rees (Astronomer Royal, Baron Rees of Ludlow, Plumian Professor of Cosmology & Astrophysics, Cambridge)

Accept nothing. Question everything. All truths are subjective, all facts are relative. The only thing in the entire universe which does not change is change. The only thing you, as a human being, can ever accept as a truth or a fact is what enters your brain by direct sensorial input. Anything that happens outside the confines of your own skull is thus, by its very nature, questionable.

Reality only exists because we agree to it. When reality shifts, the agreement has been changed.

Joseph Campbell, mythologist and theologian, once examined the core teachings of all the world's major and many minor religions, and discovered that the similarities were far too frequent to be considered coincidence. Central to virtually every religion and every system of spiritual belief we find the Golden Rule:
• Baha'i: "Lay not on any soul a load that you would not wish to be laid upon you, and desire not for anyone the things you would not desire for yourself." -- Baha'u'llah, Gleanings
• Hinduism: "This is the sum of duty: do not do to others what would cause pain if done to you." -- Mahabharata 5:1517
• Buddhism: "Treat not others in ways that you yourself would find hurtful." -- Udana-Varga 5:18
• Taoism: "Regard your neighbour's gain as your own gain, and your neighbour's loss as your loss." -- T'ai Shang Kan Ying P'ien 213-218
• Christianity: "In everything, do to others as you would have them do to you, for this is the law." -- Jesus, Gospel of Matthew 7:12
• Unitarianism: "We affirm and promote respect for the interdependent web of all existence of which we are a part." -- Unitarian Principle
• Judaism: "What is hateful to you, do not do to others. This is the whole Torah; all the rest is commentary." -- Hillel, Shabbat 31a
• Islam: "Not one of you truly believes until you wish for others what you wish for yourself." -- The Prophet Muhammad, Hadith
• Wicca / Paganism: "An it harm none, do what ye will." -- The Wiccan Rede
Y'know what's sad? The two fastest-growing religious cults -- atheism and science -- don't have a golden rule.

A Potted History of Jesus Christ (or, What The Church Doesn't Want You To Know)

Issua ibn Iosep Hajeduhim ba Bet'lehim was an itinerant wanderer of Arabic descent from the regions surrounding the Dead Sea of the Roman Imperial District of Judea, in what is now known as Jordan. Brought up as a strict Essene Jew, as a child he learned the simple ascetic belief-system that governed the Essene way of life. His calm yet inquisitive nature, excellent memory, and aptitude with both the spoken and written forms of his native tongue, meant it was expected he would enter the priesthood, but he ultimately left his family & community as a young man of some fifteen or sixteen years of age to find his own way in the world instead.

Some years later, Issua ended up at a Buddhist monastery in the Jagannath region of eastern India, an area still rich in a blend of basic Hindu and Buddhist beliefs. Monastic texts described him as "a young Yida [Arab], dark of skin but light of spirit for the ways of Buddha are as air to him", who became an acolyte and then a mendicant teacher and healer for the region. After a few years, he decided to return to his homeland.

After rejoining the Essene communities of Jerusalem, he became dismayed to see how downtrodden the people had become under the rule of the Romans and the ruling theocrats, the Pharisee Jewish elite, and began to teach others the ways of living he had learned amongst the people of India, who still managed to lead a fulfilling, harmonious life despite the oppression of the Brahmin rulers. As with any idea which seemed to offer escape from the oppression of their elders, the young people took to Issua's ideas with enthusiasm, and despite his upbringing this notoriety prompted him to resume his mendicant ways, travelling across Judea teaching this new belief. Like Siddhartha Gautama several hundred years before him, he roamed the country from village to town to city with several self-styled disciples and acolytes, using his fame as a healer to attract people and his natural talent as a storyteller to spread his ideas wrapped up in parables.

With his notoriety as both a highly skilled healer and a spellbinding storyteller spreading before him, the sick, the infirm, the dissatisfied and the curious flocked to the larger towns along Issua's route to the northern border. It did not take long for news of the crowds Issua was drawing to reach the ears of the Roman Empire and the Pharisees; but Issua's basic message was one of respect, harmony and self-trust, and as all he was doing was teaching people how to have a better life, and he was not actively advocating rebellion against the Romans or the Pharisees, they were content to just keep a watchful eye.

During a stop-over to heal and preach in the large town of Migdal, on the north-western shore of the Sea of Galilee, Issua healed a young woman named Miryam, and she became a daily visitor to Issua's camp to hear him speak. Sharp of mind herself, she asked frequent and insightful questions that caught Issua's attention, and, both being young adults, mutual attraction took its course and they fell in love. As they were both practising Jews, they did it 'by the book', petitioned Miryam's family for permission to marry, and on returning to Migdal several months later were married in the large synagogue that served the region around Galilee. Miryam's insights and deep understanding of Issua's teachings ensured her place in the inner circle of Issua's growing ministry; disciples would often turn to her for interpretation and understanding, and she became the public female face of Issua's movement. Miryam eventually bore Issua two children, a son and a daughter.

When any public movement grows large enough, you will get hot-heads, and Issua's ministry had its fair share of militant idealists. Issua and the inner circle tried to distance themselves from the hot-heads, but this was all the Romans & Pharisees needed to deal with what they were now seeing as a political threat to their hold on power. In Jerusalem, he was taken at spear-point by Roman legionaries, made to sit through a sham heresy trial in which the Pharisees used a paid stooge to testify against him, and then sentenced by Pilate to be executed by crucifixion, ironically only a few miles north of where he had been born just over thirty years earlier.

Some three hundred years later, with the rapid rise in popularity of Issua's version of Judaism threatening to replace the religious pantheon of gods that formed the backbone of the failing Roman Empire, Emperor Constantine set plans into motion to adopt this new religion as the official Roman religion. Realising that the Roman Empire could no longer maintain absolute rule through military might, the only chance the Empire had of surviving was to become a theocratic power instead. To this end, the First Council of Nicaea was formed to gather, codify and edit a mixture of the available gospels with the brutal aspects of Rome's vengeful pantheon of gods so they were acceptable to the Roman people. As the Roman people (and their subjugated cultures) were used to the concept of gods (or a god), the Council decided to make Issua a demigod, removed all reference to his marriage to Miryam and portrayed her as a repentant prostitute, and, because Rome and the theocrats were male, deliberately excluded the three gospels written by women and created the myth of the virgin birth.

To ensure that this new religion became the official and only version, and that the books chosen by the Nicaean council were the only accepted "canon", one Bishop Athanasius decreed that all non-canonical books and gospels were to be found and destroyed, and that anyone who did not accept the 'new' religion was to be killed. Thus, the Roman Catholic Church was born.

Many of the original teachings of Issua did survive the pogrom of destruction, with several sub-sects going to great lengths to hide un-edited copies away from the long arm of Constantine and his pet bishopric. Today, the purest form of the original teachings of Issua can be found in the Egyptian Coptic Church. The oldest surviving texts of the first gospels are part of the Dead Sea Scrolls, or the Nag Hammadi Library, and translations show an almost perfect match with today's Coptic and surviving Gnostic gospels. Much to the Roman Catholic Church's chagrin, the rulers of Egypt at the time decided not to hand over these holy texts, but to keep them themselves as a national historic treasure available to all scholars, instead of them being locked away in the Vatican's library, out of reach of all but the highest-placed clergymen of the See.

Issua is, of course, better known as Jesus Christ, aka Jesus of Nazareth, and his wife Miryam is otherwise known as Mary Magdalene.