Thursday, 14 December 2017

Star Wars: The Last Jedi


Off to the cinema to see Star Wars: The Last Jedi, the latest instalment in the long-running Star Wars space saga. You wait thirty-two years for a Star Wars film and then three come along at once. To paraphrase Howard Hawks, The Last Jedi has three good scenes and lots of indifferent scenes. It's overlong and repetitive and, as is the fashion with modern Star Wars films, it feels like a combination of bits from other Star Wars films. Did I enjoy myself? I did, although I would have enjoyed myself slightly more if the film had been half an hour shorter.


Now that Disney has its hands on the franchise the films have become an annual event. Next year there will apparently be a film about space rogue Han Solo. The year after that, who knows? Something about the bounty hunters, probably. Or a variation of Fifty Shades of Grey starring sadistic torture droid EV-9D9, and hopefully Maggie Gyllenhaal. Or the swashbuckling adventures of Lobot, which will be both an action film and a poignant exploration of autism. I don't know. I don't work in Hollywood.

I saw the film at London's Science Museum. The Last Jedi was shot on good old-fashioned 35mm film with 70mm IMAX inserts. The Science Museum is the only place in the UK and perhaps all of Europe screening an actual film print. I went into the film without preparation. I hadn't seen any of the trailers or read any of the reviews. The film was released in the UK on 14 December 2017. I saw it on 14 December 2017. I was of sound mind and body.

For Force and Rogue the Science Museum introduced the films with cheaply-animated graphics of Darth Vader and Chewbacca. This time they have persuaded one of the staff to do a little filmed introduction in which he is strangled by Darth Vader. Mid-way through the screening the fire alarm went off, and we all walked out into the street, and then we walked back into the theatre and the film resumed. The end.

I've written about Star Wars before. The first film, just called Star Wars, was released in 1977. It was an enormously popular space adventure notable for its striking special effects and its sincerity. Although director George Lucas was an arty film school hipster, he treated Star Wars as if it was a real film, like Lawrence of Arabia, even though it was set in space and had laser swords and spaceships and robots. Audiences worldwide were eager to be distracted from Gerald Ford and punk music and William Friedkin's Sorcerer so they lapped it up.



The long-awaited sequel, The Empire Strikes Back, was released in 1980. Although Lucas took the risky decision to finance it from his own pocket he didn't compromise his artistic vision; Empire had a peculiar structure and a downbeat tone. Our heroes spent the whole film running from one catastrophe to another. The film ended on a bleak note, with one of the heroes maimed both physically and mentally and another imprisoned in a block of stone.

On both an artistic and technical level Empire remains the high point of the series and one of the best science fiction adventures of all time. Even today it looks and sounds awesome, all blue and orange, with cool stop-motion models and a rocking soundtrack by top orchestral mastermind John Williams. It has class; very few films have class.

The original film trilogy came to an end in 1983 with Return of the Jedi, which threw all the flaws of the Star Wars series into stark relief. The first film had been assembled from things that inspired George Lucas. There were bits of Dune, Buck Rogers, The Dam Busters, Japanese samurai films, old westerns, Lensman and so forth. The sci-fi treatment felt fresh and new, but by the time of Return of the Jedi the series had begun to cannibalise itself. Jedi is by no means a bad film; at the very least it resolved the original trilogy in the most efficient way possible that didn't result in people asking for their money back.

The fundamentally derivative nature of the series hurts The Last Jedi. Instead of drawing inspiration from outside the series, the filmmakers have remixed a collection of elements from the original films and from the flood of media that followed them. Yet again there is an evil superweapon which, yet again, has a weak point. For what must be only the second or third time, but feels like the millionth, our heroes easily infiltrate a heavily guarded military base by wearing captured uniforms. Again, the good guys attack enemy vehicles that can only fire forwards by approaching them directly from the front instead of, say, from the sides. There is a technical problem that can only be resolved by plugging in some fuses, or opening a circuit board, at which point a door opens. Our heroes run for safety towards a spaceship which is blown up just before they reach it. And so forth.


The film picks up the story from The Force Awakens, which was released two years ago. I remember being impressed that it didn't suck. The new young cast could have been irritating but were instead charismatic, even Daisy Ridley with her plummy BBC English accent, and BB-8, the cute new robot. The treatment of homosexual love between dreamy space ace Poe Dameron and reluctant Imperial Stormtrooper Finn was sweet; the characterisation of the chief villain was unusually complex for a Star Wars film, although the absence of a truly hissable baddie left the film's drama feeling surprisingly low-stakes.

On the other hand John Williams' score had one good new theme but was otherwise weak, and the plot felt like a rewrite of the original. Two years later I barely think of The Force Awakens, but then again there are very few things from 2015 that I think about, indeed I can barely remember 2015. It was the year in which no-one died. Sometimes I worry that all the sealed Force Awakens merchandise I bought might not pay for my retirement after all. The crate of Sphero BB-8 toys in particular cost a fortune. What if the batteries wear down? I'll have to pay someone in China to make new batteries.


The Force Awakens was overshadowed by last year's Rogue One, which was a prequel that filled in some of the storyline from just before Star Wars, using CGI to recreate some of that film's original cast, in the process returning Peter Cushing to the silver screen over twenty years after he died. Now that Christopher Lee is dead it would only require some deft legal manoeuvring to reboot Hammer's Dracula films with CGI versions of the original cast, perhaps including a CGI Ingrid Pitt, who is also dead. Madeline Smith and Gabrielle Drake are still alive. I hope there is a CGI model of them somewhere. I would pay money to borrow it.

Before you write in to complain, I am fully aware that Ingrid Pitt did not appear in any of Hammer's original Dracula films. I merely hope that some part of her is preserved so that she may continue to entertain audiences forevermore, even as whatever remains of her soul begs for the sweet release of oblivion. This may seem cruel and self-centred, and it is, but if this world was not created for my amusement, what was it created for? Or is the universe merely a byproduct of physical processes, created by no-one, for no purpose? Write your answers on a piece of paper and then throw it on the fire, because whatever answer you chose was wrong because there is no answer and we are all doomed.

Rogue One itself, despite production problems that resulted in a sometimes disjointed narrative - and another weak score, recorded in a rush when the original composer had to drop out - was possessed of some gripping and surprisingly brutal action sequences. The decision to make a darker film than its predecessor seemed self-conscious, and the two lead heroes were a bit dull, but overall the two films distracted me from the horror of life for a few hours, and for that I am grateful, but also resentful because they gave me hope in a world where hope is a lie.


But what about The Last Jedi? Is it any good? Is stoic hero Luke Skywalker a virgin? Sadly the film doesn't answer that. Did the audience applaud when space-princess-turned-military-commander Leia Organa appeared on the screen? No, they didn't. Leia is played by Carrie Fisher, who died almost a year ago to the day. The main credits end with a dedication to her; the audience applauded at that point. 2016 was the year in which everyone died, culminating in George Michael (25 December), Carrie Fisher (27 December) and Fisher's mother, Debbie Reynolds (28 December).

At the time it felt as if 2016 was Muhammad Ali and we were Sonny Liston; the year knocked us down in round one and then dared us to get up so it could knock us down again. Ali himself died in 2016. Sonny Liston was lucky. He died in 1970. If he had lived until 2016 he would have died as well, so it's perhaps lucky that he died earlier. The people who made The Last Jedi have access to a CGI model of Carrie Fisher - it was deployed briefly in Rogue One - but they have promised not to use it. Nonetheless it sits on a hard drive somewhere. Waiting.

Back to the film. Without wishing to spoil it, The Last Jedi borrows an awful lot from Empire. It begins and ends with an evacuation against seemingly impossible odds. The middle section has a training montage in which Daisy Ridley's Rey apparently teaches herself how to be a Jedi Master, while Luke Skywalker moans a lot; Yoda makes a cameo appearance, here rendered with CGI that's supposed to look like a foam puppet.

However it's not all Empire. The middle section also has a bit of James Bond with John Boyega's Finn and Kelly Marie Tran's Rose, a spunky space mechanic, who infiltrate an alien casino. This is, incidentally, when the fire alarm went off, so I missed a teeny-tiny bit of the action. The film has a bit of politics at this point. Our heroes turn into Jeremy Corbyn and decide to liberate the serfs and their livestock, although it's obvious that just after our heroes escape the livestock is either rounded up or killed and the serfs are put back to work. A short scene at the very end of the film suggests that the serfs were however inspired by Finn and Rose, so that's okay.

I have to wonder. Do Islamic terrorists see themselves as freedom fighters? Do they see us as the evil empire, and our society as a den of rich parasites? Are they in fact morally right, if legally wrong? As before, if you have any answers, keep them to yourself or write them on a piece of paper and burn it.


This section of the film has one of the three good scenes that I mentioned. A short but exciting chase on the back of an alien horse. What are the other two? There's a very short fight involving Luke Skywalker, in which it becomes apparent that all is not what it seems - the audience applauded this part, and I was impressed with its chutzpah. The highlight however is an extended light sabre battle involving a pair of unlikely allies. There's something audacious about it because it materialises out of thin air. It was the film's only punch-the-air moment.

Sadly however the second half of the film bogs down. The Rebel fleet becomes involved in a long chase with an Imperial squadron that seems to go on forever. Poe Dameron leads a rebellion against the Rebellion that goes around in a circle and leads nowhere. Along the way Laura Dern single-handedly destroys an Imperial Star Destroyer with a tactic that made me wonder why it hadn't been tried before; I found myself wondering why they gave her purple hair, and what happened to Laura Dern anyway? She was in Wild at Heart and Jurassic Park, and then seemed to fall down the same hole as Juliette Lewis and Mary Elizabeth Mastrantonio.

By the final battle, which again borrows a lot from Empire, I found myself becoming bored. There are only so many times you can watch a bunch of attack craft speeding towards another bunch of vehicles before your brain starts to melt. My hunch is that the director had more of a handle on the physical action than the space battles. The fight scenes are exciting, the space battles dull, except for one short sequence in which the Millennium Falcon takes on some TIE Fighters and leads them through a crystalline tunnel. This was the film's fourth good scene. It was obviously a homage to a similar sequence in Return of the Jedi, but it worked.



Overall The Last Jedi passes the time but suffers badly from padding, especially in the second half. If every scene involving Laura Dern was cut it would have been slightly better. Not because Laura Dern is bad but because her "arc" is compartmentalised and pointless. The end.

What else? The toys this time are called Porgs. They're little bird things. At one point the chief villain uses the word spunk, in the old-fashioned way; the audience laughed. The film has a short cameo from Benicio del Toro, who plays a wasted variation of Han Solo. He's terrific but sadly only in it for a few minutes. Once again Gwendoline Christie is completely wasted behind a metal mask, although as before it's ambiguous whether she dies or not. A sequence in which a beloved main character survives certain death by floating through space is either heartwarming or ridiculous depending on how drunk you are. Andy Serkis is terrific as a motion captured CGI character, this time the evil Supreme Leader Snoke. I was surprised to learn that he voiced the character as well. His performance - with lots of close-ups of leering and bad teeth - is probably the best acting in the film. Lupita Nyong'o has a one-scene cameo that was presumably filmed in a shed somewhere. I missed her; her character in Force was entertaining.

Neither Mark Hamill nor Daisy Ridley can act, in a conventional way, but they both have charisma and Daisy Ridley has gusto, so I don't mind. Do you remember Anjelica Huston? She was a better actor than Daisy Ridley, but she didn't have charisma, so no-one remembers her nowadays. The cast of Star Wars were, for the most part, very limited actors, but they had charisma, and that goes a long way. Half of them were acting behind masks and they had more charisma than Anjelica Huston.


The Star Wars films take place a long time ago in a galaxy far, far away, which means that the writers have to be careful with language. The characters can't talk about miles or kilograms or hours or New York, because those things don't exist in the Star Wars universe. In this film a character uses the word "god", and another character uses the word "bastard", which surprised me; the Star Wars universe is usually very po-faced. A gag in which one character pretends to have a bad mobile phone conversation with another character feels un-Star Warsy, and Yoda's cameo involves what may or may not be a metatextual dig at the masses of Star Wars books and merchandise that have appeared since 1977. I don't know. I just don't know, the end. Until next year.

Wednesday, 6 December 2017

The Night Before the Death of the Sampling Virus



Let's have a look at Otomo Yoshihide's The Night Before the Death of the Sampling Virus, a fascinating CD that came out in 1993. It's a collection of 77 disjointed snippets of noise, some of which are only a few seconds long. In the liner notes Otomo requests that you listen to the CD in shuffle mode, or alternatively that you listen to several copies of the record at the same time, which sounds like a clever ploy to sell more records. He also suggests that you smear grease on the disc, but I didn't do this.

I only have one copy of Sampling Virus, so I used a computer to shuffle the tracks. Then I layered them four times, thus:
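If you fancy replicating the experiment, the whole process is a few lines of Python. Here's a minimal sketch using the pydub library, assuming you've ripped the 77 tracks to WAV files with hypothetical names like track01.wav:

```python
import random
from pydub import AudioSegment  # pip install pydub

# Hypothetical filenames: one WAV per CD track, track01.wav .. track77.wav
tracks = [AudioSegment.from_wav(f"track{n:02d}.wav") for n in range(1, 78)]

def shuffled_playthrough(tracks):
    """One complete play-through of the disc in random order,
    as a CD player's shuffle mode would do it, minus the seek gaps."""
    order = random.sample(tracks, len(tracks))
    return sum(order[1:], order[0])

# Four independent shuffles, stacked on top of each other
layers = [shuffled_playthrough(tracks) for _ in range(4)]
mix = layers[0]
for layer in layers[1:]:
    # every layer has the same total duration, so nothing is truncated
    mix = mix.overlay(layer)

mix.export("sampling_virus_x4.wav", format="wav")
```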


Whilst that sweet sound assaults your ears contemplate all the music you might be listening to instead. Sampling Virus suffers from a technical problem whereby compact disc players can't seek instantly, so no matter how it's sequenced it always sounds like a series of disjointed sound snippets. A few years later Autechre, working under the name Gescom, released Minidisc, a minidisc that exploited the format's gapless playback. I haven't heard it. The idea sounds a bit naff nowadays. It's probably the kind of thing that was envisaged by the man who devised the CD specification way back in the late 1970s.

Extreme Records still exists. It has nothing to do with Hungary's Arrow Cross Party, despite having a similar logo.

Sampling Virus was released a few months before Meat Loaf's Bat Out of Hell II and was largely overshadowed by that record. I remember hearing Meat Loaf a lot more on the radio. As a consequence it didn't chart. Is it glitch music? I'm not sure. Some of the lengthier tracks have a glitchy sound, but I think they were edited manually with a sampler. The idea of randomly ordering fragments of music isn't really glitch music, it's chance music, which is something else.

I've always thought of glitch as a musical form that exploits the technical errors of musical equipment, as in Oval's Systemisch (1994), which uses CD-skipping sounds. Sampling Virus is essentially Japanese noise music with a chance element.

Will I ever listen to it a second time? Unlikely, but there are few people on this planet who can truthfully say that they have listened to Sampling Virus all the way through, and I am one of them. Last year I wrote about Touch Records' Ringtones, a collection of audio snippets intended for use as mobile phone ringtones - the album was released about eighteen months before audio ringtones took off, and is interesting now for being slightly ahead of the curve. Ringtones has 99 tracks, and although it was never intended to be played in shuffle mode, or even listened to as an album, it works equally well as an accidental sequel to Sampling Virus. Just for fun I decided to apply the same treatment to Ringtones that I applied to Sampling Virus up the page:



Has there ever been a genre more ripe for commercial discovery than Japanese noise music? Katy Perry's most recent album, Witness, has been relatively unsuccessful, but she has a knack of bouncing back from adversity with a new sound. What better seam to mine than Japanoise? Few genres encapsulate modern life more perfectly than Japanese noise music, and where Katy Perry leads, we will follow. Glitch music itself almost threatened to break into the mainstream a few years ago, and although the likes of Merzbow and KK Null are not mainstream figures they could probably sell out the Royal Festival Hall, so Katy Perry should have no difficulty bringing Japanese noise to the world's arenas. She has something that Merzbow doesn't have; attractive breasts.

If you think about it, choral music is a kind of glitch music. The vocal texture of a choir comes from the layering of different voices; if all the voices in a choir were identical, the result would sound like a very loud soloist. Even highly trained vocalists can't produce pure tones, because the human animal is much less precise and repeatable than a machine. We don't mean to be different, we just are.
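You can demonstrate this with a few lines of numpy. Sixteen identical sine waves sum to one loud sine wave; sixteen slightly detuned, out-of-phase copies produce the beating and shimmer we hear as a choir. A toy sketch, not a serious vocal model:

```python
import numpy as np

SR = 44100
t = np.arange(2 * SR) / SR  # two seconds of audio at CD sample rate

# Sixteen identical "singers": the sum is just one very loud soloist
soloist = sum(np.sin(2 * np.pi * 220.0 * t) for _ in range(16))

# Sixteen humanly imperfect singers: random detuning and phase offsets
rng = np.random.default_rng(0)
choir = sum(
    np.sin(2 * np.pi * (220.0 + rng.normal(0.0, 1.5)) * t
           + rng.uniform(0.0, 2 * np.pi))
    for _ in range(16)
)
# The slow beating between the detuned copies is the "choral" texture
```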

Wednesday, 22 November 2017

Apple Power Macintosh G5: Flame On


Let's have a look at the Apple Power Macintosh G5, a weighty space heater that can also perform computing tasks. Apple launched the G5 in 2003 with great fanfare, but nowadays it has a decidedly mixed legacy. In 2003 it was a desktop supercomputer that was supposed to form the basis of Apple's product range for years to come, but within three years it had been discontinued, along with the entire PowerPC range, in favour of a completely new computing architecture. The G5 puts me in mind of an ageing footballer who finally has a chance to play a World Cup match; he is called up from the substitutes' bench, entertains the crowd for twenty minutes, but the team loses, and by the time of the next World Cup the uniform is the same but the players are all different. Our time in the sun is brief, the G5's time especially so.

I've long been a PC person, and from my point of view the G5 came and went in the blink of an eye. I knew that it had a striking case and a reputation for high power consumption and heat output, and for being 64-bit at a time when that was rare in the PC world, but that's about it. Almost fifteen years later G5s are available on the used market for almost nothing - postage is incredibly awkward - so I decided to try one out. Mine is a 2.0ghz dual-processor model, the flagship of the first wave of G5s. Back in 2003 this very machine was, in Apple's words, "the world's fastest, most powerful personal computer".

Before turning it on for the first time I informed the local flying club that I was about to activate a powerful radio frequency source. I sent a letter to the Home Office and British Telecom and my electricity provider to inform them that I was not a terrorist, and that the souls of the dead are reincarnated on Jupiter, and that I no longer wished to be married to Helena Bonham Carter. Furthermore I removed all of my clothes out of fear and assumed a defensive posture, attempting to prove to any and all observers that I was no harm to anyone.

I was slightly disappointed when the machine started not with flames but with a muted whoosh, which settled down into a quiet hum. Nothing exploded or shuddered. Masked men did not burst into the room. Instead there was a familiar chime and OS X 10.5.8 loaded up.

My G5 still had its original 160gb 7200rpm hard drive, which was very noisy. The date code says that it was constructed in August 2003.

I have a gadget that can measure the power consumption of electrical items. My desktop PC is a quad-core 64-bit i5-2500k running at 3.3ghz, with an SSD, two spinning hard drives, and an Nvidia GTX 750. I built it back in 2011, and apart from adding the graphics card and SSD I haven't felt the need for more power.

According to my gadget the PC idles at 70 watts and at full whack consumes 150 watts of power. Under load that's about the same as a modern 4K television, perhaps slightly more if I include the PC's monitor, but my PC rarely runs at full power.

The G5 plays DVDs perfectly well, like every desktop computer since the late 1990s. The G5 predated Blu-Ray, but there were a couple of G5-compatible Blu-Ray drives. They were only useful for burning data discs, however, as only the fastest G5s had the necessary combination of processor grunt and graphics hardware to decode Blu-Ray video, and even then finding PowerPC software that would play Blu-Ray films was problematic.

In comparison my new, fourteen-year-old Power Macintosh G5 has two separate 64-bit PowerPC 970 processors running at 2ghz. It has two spinning hard drives and an ancient Radeon 9600. At idle it consumes 140 watts of power - only ten watts less than my PC under load - and when taxed it sucks up 280+ watts. That's almost twice as much as my fridge, and apparently with very heavy processing power consumption goes up to 400+ watts. On the positive side the G5 is quieter than my PC. It has more fans, but they run slower, only going mad every once in a while.
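Out of morbid curiosity I worked out what that idle draw costs over a year. A back-of-the-envelope sketch; the 14p-per-kWh tariff and the eight hours a day of use are assumptions, not figures from my actual bill:

```python
# Annual electricity cost at idle, eight hours a day, 365 days a year.
# Assumed tariff: 14p per kWh - a guess, not a quote from my bill.
PENCE_PER_KWH = 14
HOURS_PER_YEAR = 8 * 365

def annual_cost_pounds(idle_watts):
    kwh = idle_watts * HOURS_PER_YEAR / 1000
    return kwh * PENCE_PER_KWH / 100

print(annual_cost_pounds(140))  # G5 at idle: about 57 pounds a year
print(annual_cost_pounds(70))   # my PC at idle: about 29 pounds a year
```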

OSX 10.5.8 Leopard was the final version of OSX for the PowerPC-era Macintoshes. Leopard is now ten years old. It's a close contemporary of Windows Vista, but whereas Vista was mocked as a bloated mess Leopard was generally regarded as a decent-albeit-inessential upgrade of 10.4 Tiger. As a PC person I'm impressed with how well Leopard and the G5 have aged. The toned-down look of mid-decade OSX Leopard is still attractive. It recognises USB peripherals without popping open an irritating dialogue box, the interface feels smoother and less flaky than Windows XP on a similarly old machine, and the tablet-esque "cover flow" feature feels ahead of its time.

Cover Flow was included in the first iPhone, released at almost the same time as Leopard in 2007, and although Apple has subsequently fallen out of love with Cover Flow it remains part of MacOS nowadays. Leopard runs an obsolete version of iTunes, version 10, which is faster and easier to use than the latest version.

Back in 2007 there was some debate as to whether Tiger (older, less featuresome, but faster) or Leopard (a few more features, more modern, slower) was the best choice for the Power Macintosh G5. I'm in two minds. My hunch is that the minor performance hit of Leopard is insignificant on the faster G5s, but on the other hand very little OSX software was made obsolete in the jump from 10.4 to 10.5 - Photoshop CS4 and Logic Pro 8 also work on 10.4 - so beyond a feeling of completeness there's no pressing practical need to switch from 10.4 to 10.5.

A pair of 1TB Western Digital Caviar Greens I had lying about, with jumpers over pins 5+6 to force them into SATA-I/II mode. The Intel-powered Macintosh Pro had four drive bays arranged from front to back through the middle of the machine, with "cold swappable" caddies. The G5 isn't quite as elegant - the caddy is fixed in place, and the drives go in backwards, so you still have to plug in the SATA connectors. G5-era OSX had software RAID support, although I suspect an SSD would be more sensible.

I bought the G5 mainly to use Logic Express, a music sequencer. Here's a song I wrote with this combination, as featured in the previous post:


And here's a little video of Logic Express performing the track. A more complex arrangement with masses of reverb would sorely tax the CPU, although it wouldn't be too hard to fix this with creative bouncing:


There was something melancholic about the process of setting up Logic Express. Logic's big selling point is its simple interface and its massive selection of genuinely good built-in sound generators, in particular a decent software sampler that has a range of usable, natural-sounding, but not annoyingly obvious instruments. Logic uses AU "audio unit" plugins instead of the more common VST standard. There was a boom time in the early 2000s when masses of free VST/AU plugins were available, but over the years the market has died off, partially because the audio units included with the modern Logic Pro X are extensive and well-made, partially because the developers have moved on.

So there was something sad about hunting down old plugins that were last updated in 2007, hosted on personal websites that died in 2011, or that remain as shells with (c) 2005 dates on them. It's as if creative electronic computer-based music flourished in the early part of the 2000s and then died suddenly, which is surely not the case, but browsing through dead links gives that impression. HyperUPIC and Sonasphere, for example, appear to have completely left the internet, never to return. Fortunately MDA and DestroyFX still exist. Perhaps I'm out of touch.

The only PowerPC browser actively maintained today is TenFourFox, which is based on Firefox. On my 1.67ghz PowerBook G4 it's painfully slow, but it's much more usable on a G5. It even accesses the web version of Google Drive without grinding to a halt. It has very limited support for online high-def video and Netflix is a distant dream, but on the whole it makes the G5 almost a usable everyday machine, especially if you own a solar power plant. I imagine the late dual-core and quad-core G5s would be pretty speedy.

The problem of course is that the same could be said of almost any cheap laptop or Windows tablet released during the last ten years, minus the bit about having a solar power plant. The laptop would probably have more USB ports, and might well have USB 3, which would compensate for the lack of the G5's second drive bay. Anandtech made this very point when they tested one of the later G5s against a 2010 Mac Mini, which was generally faster while using roughly one-tenth the power. My late-2008 MacBook Pro, for example, is slightly less flexible than the G5 but runs MacOS High Sierra and is much faster.


Fourteen years later the exterior of the case is still stunning. The interior has however tarnished a bit.

I decided to benchmark my G5. I ran the trial version of Geekbench 2, an older benchmarking utility that runs on different platforms. Geekbench is as old as the G5, and in a neat coincidence it uses the lowest-specification Power Macintosh G5 as its benchmark, with a score of 1000. Perhaps the author wrote the original version on a G5. I don't know.

My 2.0ghz dual-processor G5 scores 1645, which makes it roughly 65% faster than the 1.6ghz entry-level model, or at least the benchmark score is 65% higher. That's reasonable given that the 1.6ghz model had a lower clock speed, slower memory, a lower bus speed and only one processor.

The G5 again. It was divided into three thermal zones. The front of the machine is to the left. From top to bottom, left to right, the top compartment has a DVD drive, a pair of fans, and two hard drives. The middle compartment has a fan plus mono speaker mounted in front of a 56K modem, followed by space for PCI-X cards and the AGP graphics card. At the rear of the machine is a catch that releases the access panel.
The lower compartment has the memory sitting beneath a Wi-Fi/Bluetooth card, then the CPU modules and their heatsinks, then another pair of fans. The base of the case contains a hefty 600w power supply.

The 17" 1.67ghz PowerBook G4 I wrote about last month scores 883, drawing just 47 watts whilst doing so, which suggests that the 1.6ghz Power Macintosh G5 wasn't much cop. Putting it another way, my G5 draws six times more power than a contemporary G4 laptop, but benchmarks only twice as fast. I realise I'm comparing two different fruit, but the G5 feels like an attempt to achieve performance gains with brute force rather than sophistication. It also feels like the result of two companies with different product release schedules trying to reach an uneasy compromise. The G5's deficiencies were masked by the fact that contemporary PCs were just as power-hungry, but therein lies a history lesson.

In the 1980s and 1990s home computers didn't use much electricity. No-one cared about "thermal design power" and most computers were either air-cooled or used a single fan to push air over the power supply unit. By the 2000s however heat became a major issue. Intel's Pentium 4, introduced in 2000, was a small step in terms of performance improvements over the Pentium III but a giant leap in heat output. Even without overclocking the typical Pentium 4 system required a PSU fan, a CPU fan, a case output fan, perhaps input fans and a fan on the graphics card. I call the early-mid 2000s the time of nine fans.

Eventually the Pentium 4's design team reached something called the "power wall", whereby the gains from extra clock speed were outweighed by the difficulty of cooling the chip. Furthermore a phenomenon called electromigration, whereby circuitry degrades at higher temperatures, started to eat into the lifespan of the chips. This is one of the reasons why modern CPUs tend to use multiple, modestly-clocked cores rather than one single very fast processing unit.

The Power Macintosh G5 did actually have nine fans. My Power Macintosh G5 has nine fans. Two are hidden away in the base of the machine, where they are attached to the power supply unit. Four fans front and back draw air over the gigantic CPU heatsinks. There's a single fan in the middle, which cools the PCI cards, and two small fans blow air over the hard drives and rear of the motherboard.

That's nine fans. I've counted. Two plus four (six) plus one (seven) plus two (nine) equals nine. The case has an array of temperature sensors that make sure everything is cooled effectively. Each machine apparently has a unique thermal profile stored somewhere in its BIOS. Some modifications require that the thermal profile is recalibrated before the fans work properly again, and you can only do that with Apple's G5 diagnostic tools, which aren't publicly available.


Photoshop CS2 - technically Bridge - plus TenFourFox. In this shot I've had to use Adobe's DNG converter to convert my camera's RAW files. Good luck finding the last version of Adobe's DNG converter that supports the PowerPC! The G5 will run up to CS4, which is still competent nowadays.

The G5's case is a clever piece of design that works well, but there must have been a better way. Imagine if the time and brainpower spent dealing with the G5's heat generation had been applied to other problems instead. Back in the 1990s the PowerPC chip was touted as an efficient, RISC-based alternative to the Intel 80X86. It ran at lower clock speeds than contemporary Pentiums but did as much work. Successive generations of the PowerPC chip kept Apple Macintoshes competitive during the late 1990s, but the architecture started to lag in the early 2000s with the introduction of second-generation Pentium 4s and efficient X86 clones from AMD. The G3 and G4 remained competitive mobile chips but Apple was in danger of having its desktop machines fall behind.

The PowerPC G5, formally known as the IBM PowerPC 970, was Apple's great white hope. It was announced at Apple's 2003 keynote presentation, which is available on YouTube:


The keynote is like something from a parallel world. Nowadays Apple's product announcements are full of 3D face recognition and all-glass backing and dual-lens cameras and rose gold; in 2003 the company chose to highlight the G5's bandwidth and bus speed and its advanced chip fabrication technology. Nowadays Apple doesn't talk about cost - if Sir or Madam baulks at the price of a new MacBook Pro, perhaps Sir or Madam might consider going elsewhere - but in 2003 the G5 was sold as a cheaper alternative to an equivalent dual-Xeon PC. There was also a rackmounted file server version of the G5, the XServe, which is something the modern Apple would never dream of releasing. This was a time when Apple was fond of pointing out the UNIX roots of OS X.

On paper the G5 looked terrific. The PowerPC 970 was a 64-bit chip attached to a system bus that ran at lightning speed, with a multi-processor-enabled architecture that could access up to 8gb of memory, with SATA hard drives and an awesome case. All except the most basic Power Macintosh G5 machines had either two CPUs or a dual-core chip - in one case two dual-core chips - and later machines increased the memory limit to 16gb, with the last batch of G5s adding support for PCI-e. Even in 2017 the idea of a quad-core desktop PC with 16gb of memory plus SATA and PCI-e sounds current.


The G5's considerable weight is focused on these little pads. There were aftermarket cork pads, but I've used masking tape to wrap a pair of old cycle gloves around the handle-stands, which doesn't change the fact that in 1998 The Undertaker threw Mankind off Hell In A Cell, causing him to plummet sixteen feet through an announcer’s table.

As mentioned earlier Apple's television adverts claimed that the G5 was "the world's fastest, most powerful personal computer", although here in the United Kingdom the ITC objected to that claim and forbade Apple from repeating the ad. The single-processor machines weren't particularly impressive, but the G5 was new and hopefully had room for expansion whereas the Pentium 4 and Xeon were in 2003 several years old.

But that seems to be where things went wrong, because the G5 didn't have room for expansion. Or contraction, because no matter how hard IBM tried they couldn't produce a chip that would fit into a laptop. The PowerPC 970 was simply too power-hungry, and when underclocked it didn't run much faster than the G4, and the 64-bit architecture was of questionable benefit in a mobile context. Many years later ARM demonstrated that it was possible to make incredibly frugal, powerful mobile RISC chips, but that was still science fiction in the early 2000s.

The G5's 64-bit architecture was something of a false start. Only a handful of applications used the G5's 64-bit address space, and although OSX could access huge amounts of memory it didn't have a 64-bit kernel for several years. When Apple abandoned the PowerPC they temporarily took a step back into a predominantly 32-bit world with the Core Duo, only fully embracing 64 bits a few years later, with the Core 2 Duo and OS X 10.7.


Logic Express is essentially a multi-track audio / MIDI sequencer on a G4 PowerBook. On a G5 however it will run lots of instruments and effects at once.

The G5's lack of mobile mojo was unfortunate in a world that was gradually pivoting towards mobile computing and mobile internet, doubly so given that half of Apple's profits came from its laptops. It's fascinating to speculate whether the absence of a mobile G5 was down to IBM's technical inability to make one or simply a lack of motivation. In the past IBM had made mobile versions of the 80386, and it had even sold RISC-powered versions of the ThinkPad laptop, but that was a long time ago, and IBM had no other use for mobile chips. In the early 2000s Apple had turned the corner into profitability but from IBM's point of view Apple was still just another customer, of relatively minor importance. IBM might have spent a fortune setting up a dedicated POWER mobile team, but to what end?

And there was something else. Intel had launched the Pentium 4 with great hopes that it would still be around in ten years, but development hit a brick wall in the early 2000s, and for a brief period AMD seemed poised to become the dominant player in the X86 market. Intel's problems with the hot, power-hungry Pentium 4 mirrored those of IBM with the G5, but whereas IBM was uninterested in a mobile G5 Intel made three concerted attempts to stuff the Pentium 4 into laptops, failing each time. To its credit Intel wasn't too proud to admit defeat. After going back to the drawing board the company came up with the Pentium M, which was released to the world in 2003, just as Apple was putting the finishing touches on the Power Macintosh G5.

The Pentium M was perhaps the most influential CPU of the early 2000s. Very few people recognised it at the time. With the exception of a few small-form-factor PCs it was generally fitted into laptops, which were of limited interest to performance enthusiasts. Laptops had a reputation for being underpowered; no-one in 2003 expected that the Pentium M would be any good. Its name was easy to confuse with the earlier Pentium 4M, and furthermore Intel insisted on downplaying the Pentium M in favour of the Centrino platform. Part of this unwillingness to publicise the Pentium M might have come from the fact that the design owed more to the Pentium III than the Pentium 4. I imagine Intel was unwilling to make the Pentium 4 look bad given that it was still theoretically their desktop flagship.

But not for long. The Pentium M didn't just outperform the Pentium 4M mobile chip, it also benchmarked within a few percent of the desktop Pentium 4, while consuming less power and generating less heat. After a brief diversion with the Pentium D Intel essentially gave up on the Pentium 4 in favour of a multi-core development of the Pentium M, which was sold as the Core architecture. The mostly-mobile Core Duo and desktop-oriented, 64-bit Core 2 Duo went on to re-establish Intel's dominant position in the X86 marketplace.

Apple had, as a side project, already ported OSX to the X86 architecture. There were rumours that Pentium 4-based development machines actually ran OSX faster than the G5. At some point Apple's engineers must have become privy to the Pentium M development roadmap, and in mid-2005 Apple publicly announced that it was saying goodbye to the PowerPC architecture in favour of Intel.

Given the G5's notorious heat issues the switch to Intel was less of a shock than it might have been. I have the impression that long-term Apple fans are fond of the PowerPC era and nostalgic for the likes of the dual-processor, mirrored drive door G4, but not blind to the G5's faults. Apple fans aren't like Amiga fans, thank goodness. They know when to admit defeat.

The fans pull straight out. Further work generally isn't necessary - it's easy enough to blow dust out of the heatsinks, and the airflow tends to keep the G5's interior surprisingly clean. Most other faults are terminal and can be fixed by throwing the G5 into a deep bog and buying a new one instead. The single-processor models just had the top heatsink and fan. The liquid-cooled models enclosed the CPUs and cooling unit in a single large block.

The Pentium M's life ran alongside the PowerPC 970/FX used in the G5. They were both launched in 2003 and ended their lives in 2005. During that period the Pentium M underwent a die shrink and scaled from 1.3ghz up to 2.27ghz, roughly doubling in performance in the process. The PowerPC 970 also underwent a die shrink, but its performance increases were more modest until the very last wave of multi-core G5s, which were impressively fast but not enough to change Apple's mind.

The first wave of G5s consisted of a 1.6ghz entry-level model, a 1.8ghz also entry-level model, and a 2ghz dual-processor flagship. The second wave, launched in mid-2004, introduced the more efficient 970FX processor but was otherwise very similar to the first wave, with a 2.5ghz model sitting at the top of the range. Performance-wise the second wave seemed to be only slightly faster than the first. A third wave came out in mid-2005, but again the machines were much the same as their predecessors. The last batch of G5s emerged in October 2005 and introduced dual-core processors and PCI Express ports. They were launched when it was already known that Apple was going to abandon PowerPC and were therefore doomed to be the last of the line.

Apple also launched a couple of orphan G5s - a 1.8ghz Dual Processor machine that filled out the bottom of the range, and a single-processor 1.8ghz model that used iMac components in an attempt to sell a budget model. More than half of the fourteen different G5 models ran at 1.8ghz or 2.0ghz. The final, dual-core 2ghz model was only slightly faster than my first-generation 2ghz dual-processor G5; the last 1.8ghz model was actually slower than its predecessors. Meanwhile the later high-end models needed liquid cooling units to tame their incredible thermal output.

Within a few years some of the liquid cooling units developed leaks that could silently corrode the machines away from the inside. Some units were more reliable than others, but nonetheless the effect on the resale value of liquid cooled G5s was dramatic. If you don't want to bother with liquid cooling the most powerful non-liquid G5s are the third-wave dual-processor 2.3ghz models and the 2005 dual-core 2.0ghz and 2.3ghz models, of which the dual-core models are the most desirable due to the inclusion of PCI-express.

The access panel is a rigid, weighty chunk of aluminium. If the entire G5 run had fallen through a timewarp to Nazi Germany circa 1941 the scrap aluminium could have filled the sky with Messerschmitts.

Upgrading the G5 is generally easy. In ascending order of difficulty, easy first:

Memory
All but the most basic models had eight RAM slots, which accept memory in pairs, working from the inside out. They aren't picky; you can mix brands and capacities. My G5 has the two 256mb sticks it was sold with, plus a pair of 512mb sticks, plus two pairs of 1gb sticks, for a total of 5.5gb. The later models could accept up to 16gb of memory, although from what I have read going beyond 4gb under Leopard gives only minimal improvement, and only then if you're using something like Photoshop a lot. The memory is air-cooled and does tend to get hot. My desktop PC has four slots but two of them are blocked by hard drive cables; the G5's memory is easy to reach.

Storage
All models take SATA hard drives, but the early models were designed for the SATA-I standard. This means that if you use a modern SATA-III drive, you have to put a little jumper over pins 5-6 to enable SATA-I/II compatibility. I have a packet of these little jumpers. I'll paste some of them into the next line so that you can use them:
[[[[[[[[[[[

The G5 is fussy with SSDs, and OS X 10.5 doesn't support TRIM, and even though SSDs are now trivially cheap I had a pair of old Western Digital Caviar Green HDDs sitting about doing nothing, so I used them instead. I cloned the operating system across. My original G5 hard drive is in theory faster than the Caviar Green (7200rpm vs 5400rpm), but it was very noisy, and in my personal experience new slow drives tend to be better performers than old fast drives.

The G5 uses PCI-X slots, which are compatible with original PCI cards. PCI-X was a dead-end standard common in servers; it was obliterated by PCI-Express. In 2017 you will only find PCI-X cards on the used market. My G5 came with a four-port eSATA PCI-X card that will connect with external eSATA hard drives and a Mark of the Unicorn audio interface card that's useless without an external hardware module that I don't have. Most PCI-X cards were for storage, ethernet access, RAID controllers and the like. Sadly there don't seem to be any PCI-X USB cards.

eSATA, by the way, was essentially SATA but for external drives. As with FireWire 800 it was competitive with USB for a while but eventually overshadowed by USB 3.0. eSATA doesn't transfer power, so you can only use drives and drive arrays that have their own power supplies.

The G5 has three USB 2 ports, two FireWire 400 ports, and a FireWire 800 port. In my life I have never used FireWire to transfer data. I will probably go to my grave having never used FireWire.

Video
The initial wave of G5s shipped with graphics cards that had a DVI port and an ADC port. What was ADC? It was a proprietary Apple thing that, like so many proprietary Apple things, was technically clever enough that it didn't seem like deliberate lock-in, but nonetheless didn't even take off within the Apple ecosystem, let alone outside it. Only a handful of Apple monitors supported it and the standard was essentially dead even before the G5 came out. There are ADC-DVI adapters available, but they're too expensive to make sense. The later G5s used graphics cards with dual DVI outputs; the most powerful had a dual-link DVI port that could drive the 30", 2560x1600 Apple Cinema Display.

Early G5s used AGP; later models used PCI-E. My G5 has an air-cooled Radeon 9600. In theory I could upgrade the card, but in practice the G5 only accepts Macintosh-only versions of the various graphics cards that were available, and they're rare on the used market because most of them were sold with the G5 rather than separately. There's not much point upgrading the G5's graphics card unless you want to use dual monitors. OS X might feel slightly snappier. The few games available for the G5 might run faster. I would be wary of the extra heat and power draw.

Everything Else
My G5's optical drive is a bit flaky. Sometimes it reads a disc, sometimes it doesn't. Replacing it is apparently easy but I'm not going to bother. There was a brief period in the early 2000s when it was feasible to back up data to a writeable DVD but with the availability of cheap SD cards and USB sticks there's no point any more.

Fans, brackets, antennae and other components are still widely available. The G5 is in theory entirely replaceable - you can build a new one from spare parts and an empty case, if you have a copy of Apple's thermal calibration software - but there's no point when so many G5s are available on the used market.

As a long-term ownership proposition the G5 is problematic. The fastest models were outpaced by their Intel replacements either immediately or within a couple of years, and are thoroughly obsolete nowadays; the G5's power consumption is a reminder of a time when oil was cheap, interest-only mortgages were a fantastic idea, and the economy was not only going well, it would continue to go well forever. Using the G5 as a file server or overnight rendering machine is an expensive proposition. As a space heater it's less efficient than an actual space heater unless you do useful work with it.

On a more esoteric level the G5's fantastic case can be stripped out and used to house a PC, although it's tricky because the ports and buttons don't conform to the PC standard. This chap here chopped his down and made a cute G5-based mini-PC. Alternatively you could gaffer tape a Mac Mini to the inside and just run all the cables through the cooling holes, using a little stick to press the power button. Two G5s joined with a plank of wood make a neat coffee table. Turned on its side, stacked, and suitably modified the G5 can be used as a chest of drawers. The G5's metal case generally resists corrosion but isn't stainless, so its marine applications are limited.

Seriously though, the G5's aluminium case is now both its greatest strength and its greatest weakness. On the downside it's too huge to send through the post, but on the upside it's a solid, genuinely impressive work of engineering that feels useful as a spare part, if only as an objet d'art. Apple is routinely mocked for putting style ahead of substance, but the G5's case is a superb example of a functional design that works well and is also beautiful to look at and indeed think about.

Even with the fan in place there's still a large empty space in front of the memory chips. Enterprising storage vendors sold brackets that could house extra hard drives in this space, although bearing in mind that the memory chips get hot I'm sceptical that it would have been a good idea. I often wonder if Apple could have sold a scaled-down G5 case purely as a robust RAID enclosure, with fans etc.

Back to Geekbench. My 2003 2.0ghz dual-processor G5 scores 1645 while consuming 280+ watts of power. In comparison my late-2008 unibody MacBook Pro laptop, powered by a 2.4ghz Intel Core 2 Duo, scores 2758, drawing 45 watts in the process. By that time the Intel-powered, eight-core Macintosh Pro was Geekbenching values of almost 10,000. My desktop PC, an Intel i5-2500k, Geekbenches at 8238, consuming 150 watts. One criticism levelled at the Intel-powered Macintosh Pro was that despite its class-leading power, the edge it had over ordinary Core 2 Duo / i5-powered Intel hardware didn't justify the cost, but that's another argument for another blog post.

Judging by Everymac's figures the very last, quad-core G5 Geekbenched at 3316, which is very impressive for a machine now twelve years old. I imagine it would still have a niche in a recording studio, if you had the aforementioned MOTU audio interface and the appropriate version of Logic Pro and were so familiar with the workflow that it would be disruptive to change. I shudder to think of that setup's power consumption over twelve years of use.

What's the logical next step from a G5? That's a difficult question. Apple intended for you to replace your G5 with an Intel-powered Mac Pro. The Mac Pro used the same basic case design as the G5, albeit that the interior was rejigged. It was conceptually much the same as the G5, combining multiple processors and multiple drive bays with a plethora of RAM slots and ports. However the switch to Intel coincided with a new appreciation for frugal computing, and many former G5 owners opted for one or more Mac Minis instead, using USB and latterly Thunderbolt for external storage.

The G5-descended Macintosh Pro was discontinued in 2013. Apple then intended for professional users to adopt the next-generation Mac Pro, a tubular monstrosity that defies description, but in practice professionals often switched to the 5K iMac, assuming they remained with the Macintosh platform at all.

The problem is that the basic design philosophy of the G5 and Mac Pro - monster processors, tonnes of internal storage, all in a big case - is a throwback to the past, because for all but edge cases standard desktop processors are fast enough and faster ports mean that external storage isn't appreciably slower than internal storage any more. Furthermore The Cloud continually eats away at the idea of a fat client of any kind.


From top to bottom the graphics card has DVI and ADC ports. Then there are holes for the wi-fi and Bluetooth antennae, although my machine connects to the internet without them. Then SPDIF, line out, line in, USB 2, Firewire 400, Firewire 800, Ethernet, Modem. The small front panel has the power button, a headphone port, USB 2, Firewire 400. I've plugged in a USB hub, because three USB ports isn't enough.

Nowadays the G5 is a magnificent example of excess. The outlaws of Sam Peckinpah's classic Western The Wild Bunch "came too late and stayed too long"; the G5 came too late, but with a lifespan of only three years its time was brief.

And gone forever, because fourteen years later the public's appetite for electricity-guzzling computers is about as great as that for petrol-guzzling cars, i.e. nil. Some G5s probably soldier on in recording studios, and if you happen to be given one for free and you're willing to leave the television turned off and never use the oven it's a perfectly usable albeit very slow desktop computer. It's the cheapest Macintosh desktop tower that's still generally usable.

But even a cheap Intel Atom-powered Windows tablet outperforms it, and once you get bored you face the difficult prospect of selling it on again. If you live near a small airfield they might be able to use it as means of de-icing aeroplanes. When I tire of mine I will offer it to the Royal Navy as a potential replacement for their amphibious assault craft. On the one hand aluminium has a tendency to melt at high temperatures, but on the other hand the RN is strapped for cash, besides which ships have access to huge amounts of seawater, the end.

Sunday, 12 November 2017

The Asteroid Races


Power Macintosh G5 + Logic Express 9 + twelve megawatts of electricity

Here's some music recorded with an ancient Power Macintosh G5. Mine is a dual-processor, 2ghz model that was launched in 2003. I'll write about the G5 in the next post, but for the moment savour its extraordinary case, which is eighty feet tall and requires the use of a stepladder to change the hard drives, each of which is the size of a car:


In its day the G5 was undeniably powerful, but it left a bad taste in the mouth. The RISC-based PowerPC was launched in the 1990s as an efficient, streamlined alternative to Intel's X86 architecture. And yet all of a sudden Apple was selling a scaled-down IBM POWER4 server in a case that sucked electricity and spat flame and wasn't very good as a server. Within three years Apple abandoned the PowerPC architecture entirely, never to return. It was a different time, a different era, with a different future not meant for the G5.

The last time I made some music with an old Macintosh I had more screen real estate, because it was a laptop plugged into an external monitor. I used the 667mhz PowerBook G4 pictured here as essentially a MIDI sequencer / multitrack tape recorder, because it was too slow to play more than one virtual instrument at a time.

However the G5 is a lot more powerful and has five and a half times as much memory...

... and so with the exception of a handful of notes (played with Korg's PolySix VST, and MTron, on a ThinkPad X61) and of course the drum loops, the G5 generated everything.

Sunday, 29 October 2017

Bologna: My Little Pretty One


Off to Bologna, where for a week my life was a waking nightmare, because I was unable to stop singing The Knack's "My Sharona" to myself. It rhymes with Bologna and it's a catchy tune. In theory I shouldn't remember that song - The Knack didn't sell any records in the United Kingdom - but thanks to the seductive, steamroller power of American culture my memories have long since been erased and replaced with those of an American. He is roughly my age. He is inside me.




Do you know Power Pop? The Knack was a power pop band. In the 1960s pop music fragmented into multiple strands, and by the 1970s there was a sharp divide between bubblegum pop music for kids and serious rock music for slightly older kids.

Power pop was a new name for Beatles-esque pure pop music aimed at a more discerning audience. It was catchy, jangly, generally upbeat, often wistful. It attracted an audience of people who were nostalgic for the pre-psychedelic Beatles; people who were curious to see what The Beatles might have produced if LSD had not led them to experiment with trumpets and Mellotrons.





It attracted a small audience, because despite having a commercial sound Power Pop was never a huge commercial force. Big Star famously didn't sell any records; Badfinger charted, but a terrible management contract ultimately resulted in two of the bandmembers committing suicide. Later in the decade the likes of Tom Petty and Jonathan Richman took several years to build a following - Richman never crossed into the mainstream - and although there were elements of Power Pop in the work of Elvis Costello, Joe Jackson, The Cars, Blondie and so forth, those musicians were generally thought of as post-punk/new wave acts.

Power Pop never died out, but lacking a definitive style it became harder to distinguish Power Pop from other strands of music in the 1980s and beyond. Furthermore Big Star's legacy was such that otherwise un-power-poppy bands were influenced by them specifically rather than Power Pop itself, with the result that The Bangles, The Pixies, Ash, Supergrass, The Boo Radleys, Weezer etc all had power pop elements without ever really being Power Pop.





"My Sharona" is five minutes long and has an awesome guitar solo. Bologna however is a city in north-east Italy, notable for its university, its covered walkways, and for being sandwiched between Venice and Florence. A couple of years ago I discovered W. Cope Devereux's Fair Italy, a travel book written about Devereux's trip to Italy in 1884 with his wife. Of Bologna he writes that:

"At Bologna we had an opportunity of tasting the famous sausage-meat, and found it exceedingly good, the flavour being somewhat like spiced beef. The dogs of Bologna were, I believe, once a celebrated breed, which is now almost extinct. I do not mean by this remark to induce any uncomfortable reflections with regard to the sausages, but I really was surprised that nothing in the shape of a dog made itself visible in this town."




Otherwise Devereux merely passed through. Don't we all.

Bologna's main landmarks are a fountain (which was being renovated while I was there) and a pair of ugly towers. A long shopping street leads up to the towers; a second, slightly uphill street joins it at right angles and leads to the train station. Just outside the train station is a pleasant park which had some migrants playing a game of shouting at and fighting each other - probably over a woman, or perhaps the soul-crushing despair of having to sleep on a bench in Italy, with no future and no hope.

One of them greeted me in English, which always throws me. How can they tell? I may be pasty-faced and overweight, but so are Polish people (for example) and Russians. What was the point of talking to me, anyway? I can't take any of them with me to the United Kingdom; they wouldn't fit in my carry-on luggage. A small baby, perhaps, but I have no use for a baby. A woman maybe, but it's inevitable that she would leave me almost as soon as we arrived in the country - I'm not making that mistake twice.

The town itself is a maze of covered streets with a metric tonne of fancy-but-not-posh restaurants; truth be told I had never really thought about Bologna before, but it's a pleasant base for exploring the north-east of Italy. It's low-key, warmer than Milan, less ramshackle than Rome, and prostitutes are much harder to find than in Naples.

Venice is in the north-east of Italy. 




During my stay I popped on the train to Venice and Florence; took the bus to the nearby Lamborghini Museum, read-about-but-didn't-visit the nearby Ferrari and Ducati Museums, also read about and also didn't visit San Marino, a mountainous enclaved microstate south of Bologna, reachable by train and bus. After several days of wandering around in baking heat I decided that I didn't want to walk around a mountain after all.

Bologna is also a train ride away from Rimini, a beach resort. That side of Italy is surprisingly close to Croatia, but sadly the Adriatic Sea is in the way.

There were no fascists in the car park. Job done! 



Invader - the French artist who goes around putting little mosaics high up on walls - has a presence in Ravenna, although as far as I know this particular mosaic isn't his work.

I visited Ravenna, a small town in which Dante Alighieri is buried. I have not read a single word of Dante's writing, and I have no idea why white American teenage boys all want to be called Dante, but I felt I had to go.






Ravenna

I can barely remember the lyrics of "My Sharona". Something about running down the length of my thigh, Sharona, and "always get it up for the touch of the younger kind". The Knack dressed in New Wave clothes but they were approaching middle age, and "My Sharona" was their only substantive hit in their native United States.



It reached number six here in the UK - "I Don't Like Mondays" and "Are 'Friends' Electric?" were number one and number two that week - but their debut album only reached number 65, their second single number 66, and they never charted ever again.

And yet because Ronald Reagan and Uncle Sam and the Internet have literally forced American culture into my mind I remember The Knack and not, for example, Dave Edmunds and Rockpile. Curse you, the United States, and your cultural hegemony! And curse me, for being weak.

But also curse the United States.