
Things Apple killed: Hardware that met the chopping block

A history of the hardware that Apple has axed
burntime5555/Depositphotos – remixed by Emily Ferron/New Atlas

Apple's decision to nix the headphone jack on this year's iPhones has certainly caused a stir. Whether you're a staunch Apple fan or you'd rather use Windows XP than send your cash to Cupertino, let's take a look at all the hardware that Apple has already killed. Then you can decide whether the headphone jack's removal follows a tradition of innovation – or something else altogether.

Apple puts down roots

You may already know about the cash-strapped, homespun roots of Apple Computer – the genesis story has worked itself into public knowledge. Its first product, the Apple I computer, was sold from the back pages of a magazine. It was only a bare circuit board, and strictly a hobbyist's machine. Buyers had to connect it to a TV set and keyboard themselves.

Small-time sales were enough to drive the development of the Apple II, a plastic-cased computer closer to the PCs of today. The first Apple II had a built-in keyboard and could connect to an ordinary TV for display. The original retail price of the computer – in 1977 – was US$1,298. Adjusted for inflation, that's about $5,158 today. Yes, it had a staggering amount of capability for its era, but it was still an expensive device that, on its own, solved exactly zero problems.
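
As a rough sanity check on that inflation figure, you can scale the 1977 price by the ratio of average US consumer price indexes for the two years – the CPI values used below (about 60.6 for 1977 and 240 for 2016) are approximate annual averages, assumed here purely for illustration:

$1{,}298 \times \dfrac{\mathrm{CPI}_{2016}}{\mathrm{CPI}_{1977}} \approx 1{,}298 \times \dfrac{240.0}{60.6} \approx \$5{,}140$

which lands in the same ballpark as the $5,158 quoted above.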

Left: Apple I computer with Steve Wozniak's signature. Right: an Apple IIc with monitor
(L) Kazuhisa Otsubo / Wikimedia Commons (R) Rama / Wikimedia Commons

How on earth did it sell? Two main factors were 1) its ability to run VisiCalc, the first-of-its-kind spreadsheet software that revolutionized the business world, via floppy disk; and 2) it looked good. Individuals with reasonable technical aptitude could set it up and use it without being intimidated by exposed wires, chips and a tangle of cords. The molded plastic case took care of all that.

In a sense, the first thing that Apple introduced – then took away – was a DIY mentality. To this day, the closed-off quality of Apple machines invokes both hostility and admiration. The closing-off process had roots in the Apple II's plastic case, and the first Macintosh took it even further: only special service centers were permitted to open the computer's case, and none of its parts or slots were user-upgradeable – an unheard-of move at the time.

What followed was the birth of modern personal computing, a period of growth and additions. But during this time, Apple underwent a tumultuous period that included failed projects, a temporary ousting of Steve Jobs, and near-decimation at the hands of its Windows-based competitors.

1998: iMac flips off the floppy disk

Fast forward to the late 1990s, with Steve Jobs back at the helm. In 1998, Apple released the first iMac, a candy-colored desktop that helped revive Apple and catapult it into the internet age. It sold well amidst acclaim and controversy, setting the tone for Apple product launches to follow.

A later-model iMac G3 with a slot-loading optical drive
Sjur Rasmus Rockwell Djupedal / Wikimedia Commons

What was missing? The floppy disk drive – the exact technology that allowed the Apple II to bridge the gap between a hobbyist's gadget and a daily-use tool – along with the legacy mouse, keyboard and printer ports. Instead, there was a CD drive and new USB technology for connecting peripherals.

An outcry erupted. Consumers were not happy about the idea of buying new USB-ready external hardware, while Apple argued that the internet and rewritable CDs were replacing floppy disks as the primary way to move files around.

But there was another point of contention as well. The first iMac was announced with a 33.6k modem, a lagging speed even in 1998. Many were incredulous – the new iMac was being billed as a screamer, where "i" stood for "internet," yet it was only being offered with subpar connectivity. In a rare quick response to consumer demand, Apple boosted the modem to 56k before most of the devices started shipping. It stood firm on the existing ports and drives.

Original iMac sales were through the roof. For several months, it was the number one-selling computer in the United States. It was so well received that a barrage of third-party USB-ready peripherals hit the market, and everything from external hardware to kitchen accessories was copycatting its colorful, translucent design.

2001: iPod, a case study of simplicity

The iMac's success laid the groundwork for much of Apple's modern identity, and by extension, the birth of the iPod in 2001. Several developments differentiated the iPod from its competitors, but for the sake of this article, let's note what the iPod was missing: visible construction elements like screws, a removable battery and an on/off switch.

First-generation iPod

These components, which would have seemed like necessary ingredients in any piece of consumer electronics, were hardly missed. Arguably, their removal bolstered the device's slick appearance, keeping the internal mechanisms out of sight and out of mind.

2007: iPhone bypasses a physical keyboard

If you were a cell phone power user in the pre-iPhone era, you were likely sporting either a snappy flip phone or a PDA, such as the stylus-navigated PalmPilot or the BlackBerry with its pocket-sized keyboard.

First-generation iPhone
Carl Berkeley / Wikimedia Commons

Of these, the first-generation iPhone most closely resembled a PDA, yet it came without a stylus or keyboard. Instead, it presented a dynamic onscreen keyboard. Many consumers found it hard to imagine a pleasant typing experience on glass (a notion that is echoed in some responses to the Lenovo Yoga launched last month), but as we now know, the onscreen keyboard became a mainstay of mobile technology.

The first-generation iPhone also lacked a removable battery and an SD card slot, omissions that are still present in today's iPhones.

2008: MacBook Air nixes the CD/DVD drive

When it was originally launched, the MacBook Air was positioned as Apple's premium ultrathin laptop. But that super-slim sexiness came at a price: the MacBook Air was the first Apple laptop to ditch the optical CD/DVD drive. It had one USB 2.0 port for connecting peripherals, plus a headphone jack and a micro-DVI port.

Critics acknowledged its good looks but questioned the utility behind that attractiveness. Despite its expensive top-of-the-line placement, it seemed less than fully featured, being the first major laptop to pare down so much hardware. Many Apple products have been criticized as status symbols first and computing devices second, and the first-generation MacBook Air received much of this ire.

It's interesting to note that since then, the MacBook Air has evolved into one of Apple's entry-level laptops. It's still super svelte, but its price and performance are on the low end of Apple's notebook offerings. Furthermore, Apple has not included a built-in optical drive on any of its new products for a couple of years now.

2012: Lightning strikes the 30-pin charging dock

Few events are as fraught with superlative language as mobile technology launch events. Of these, Apple keynotes seem to have the most sycophants in attendance. But when Apple's marketing VP Phil Schiller announced that the iPhone 5 would use a new Lightning connector, rendering the legacy 30-pin dock connector obsolete, there was a distinctly lukewarm reaction.

Apple VP of worldwide marketing Phil Schiller introduces the Lightning connector at the iPhone 5 launch event in September 2012

As expected, VP Phil lived up to his last name and shilled the phone within an inch of its life. The new Lightning connector, he said, took up less room and allowed the new flagship to offer more features at a smaller size: Weighing only 112 grams and measuring 7.6 millimeters thick, the iPhone 5 was light and thin even by today's standards.

A 30-pin-to-Lightning adapter was sold separately for about $29, but it was not compatible with all accessories. Despite the benefits attributed to the switch to Lightning, few were excited about it – except perhaps the companies that manufacture iPhone accessories. The original Lightning port controversy has since died down, but the 86ing of the headphone jack has brought this still-recent memory back to mind.

2015: 12-inch MacBook gets rid of all ports except one USB-C

The 12-inch MacBook is Apple's newest entry-level everyday laptop, but it has only one lonely USB-C port for all your peripherals and charging. For almost everyone, that means at least occasional reliance on a third-party USB-C hub or adapter.

The 12-inch MacBook, with lone USB-C port for accessories and charging
Will Shanklin/New Atlas

If a single port is intended to create a streamlined experience, it's not doing its job when a bulky hub is a constant necessity. The single port is likely a statement of Apple's belief in a fully wireless future. Even if the company is right, we're still waiting for the industry to catch up.

2016: iPhone 7 is sans-headphone jack

That brings us to the current controversy: Has Apple gone too far in removing the 3.5-mm headphone port from the iPhone 7 and 7 Plus? Should owners of wired headphones consider making the jump to Android? Will the omission be applauded in hindsight, like the iMac skipping the floppy disk, or will public demand be quietly accommodated, like the relegation of the MacBook Air to a lower-end model?

Apple's Lightning-to-AUX headphone jack adapter. Is it enough to make the iPhone relationship work?

Wherever your opinion falls, it does seem that Apple fares better when it's liberal with the axe: In the past, hardware removal has spurred industry innovation and generated massive amounts of publicity. What remains to be seen is whether Apple can be "innovative" to a fault – and whether the current round of hardware cuts is a hollowed-out iteration of a formula, or sets the stage for something much bigger.

Comments
Gavin Roe
you forgot the 64bit 060 chip
Rann Xeroxx
Just sounds like an Apple apologist story. Things like floppies and DVDs went away because they were replaced with technologies that were better, but putting proprietary ports on iOS devices was just a further enclosing of an enclosed garden, and Apple made literally $billions in licensing fees.
And there is no benefit for the customer for a laptop with only one port. It's dongle hell and makes a nice device look cluttered with all those y ports dangling off of it.
Now there is the removal of the 3.5 that benefits the customer in no way whatsoever. Sound has to be converted to analog before transmission to a speaker or from a mic, so you are simply moving the conversion from the sound-producing device into a cheap dongle or over-complicating a headphone. The problem with the 3.5 was that it could not be monetized by Apple, so it had to go. Devices like the LG V20 have all the audio improvements of the iPhone 7 and even more, yet retain the 3.5.
tapasmonkey
SCSI to Serial - that p*ssed me off so much I abandoned Apple back in 2002. They have been consistently pulling this trick every few years: sometimes they're right, but half the time they're wrong - what is certain is that it makes a lot of Apple products very expensive landfill way before their time.
Readout Noise
The one Apple product which runs counter to their "enclosed garden, almost no ports" philosophy is the Mac Mini. Basically a small box with a powerful motherboard and loads of ports.
It's the only Apple computer I have bought because it keyed me in to OSX applications I needed (actually for the most part, legacy Unix ones), at minimal hardware cost. I use it with all the Wintel PC peripherals I already had (2 monitors, wireless keyboard & mouse, external HDDs & DVD burner, external speaker system).
Bob Stuart
I've even lived in Apple-sponsored work spaces, and always wanted a Mac. When I could finally afford one, I got a new iMac. After a couple of disastrous calls to Mac Help under warranty, I vowed to never send them another dime, and I have only gained resolve as the hardware failed. The world's most profitable company sells more illusion and less substance than the rest.
Aloysius
I was telling my 10-year old niece the story of the original iMac, and she said, "What's a floppy disk?" 8-0
HowieBabe
This article reminded me of the story of the blind men describing an elephant. Each was correct, given their limited knowledge, but none provided an accurate description.
I was a member of the Homebrew Computer Club and was there the night Steve Wozniak and his groupie, Jobs, announced the Apple ][. It was a thing of beauty, in a fiberglass (not plastic) case. The electronic design was a work of art.
Woz pulled off a tour de force that still boggles my mind. Still working out of his garage, he convinced Al Shugart, the owner of the largest floppy disk manufacturer, to modify their design especially for Apple, a change that allowed Woz to reduce the cost of the disk interface by 75%. It had a side effect of making the drive sole-source from Apple.
The article also states that the Apple ][ could do nothing on its own, which baffles me. I made a good living embedding Apple ][ computers into machinery and systems of all types.
Woz developed the first open source bus system that actually worked and allowed the Apple ][ to be supported by third party developers.
The developer of the IBM PC recognized this key trait and made it a key feature of the PC. After IBM got rid of him and tried to close the architecture, they nearly lost the entire market.
After Apple kicked Woz to the curb, Jobs nearly killed Apple by closing the architecture with the Macintosh. He went on to establish NeXT Computer, which was famous for refusing to sell to anyone other than a university. It failed.
The history of computing is littered with the remains of large, powerful companies that bit the dust by treating their customer base as dummies. Ask Texas Instruments. They wound up writing off $650 million following their silly policy of suing anyone who wrote software for their fine computer.
There is so much more to this story
Lbrewer42
What kept Apple going was that their products, compared to Windows, were light years ahead in user-friendliness. People like having usable power at their fingertips that does not require them to spend more time working on the computer to get it to do what they want done instead of working on the actual projects.
Now that Steve Jobs is gone, Apple is starting to become more and more like the Windows operating system. They continue to pull away useable features and make it so you have to do many more steps than what was previously needed.
Unless they stop this trend, their hacking away at hardware will lead them to failure. Their reliability and usability have been what carried them through cuts that would have ruined other companies, and what turned the market toward making what was possible instead of continuing to milk the public for outdated tech.
ljaques
I had always thought Apple to be a rude company, but I didn't realize just how rude until this article. No wonder I've always been a PC user/tech/progger.
Matt Fletcher
Anyone remember the iPod Nano touchscreen 6th gen? This device could be attached to a wrist band and worn as a watch, and it would sync to your iPhone. Then they just disappeared at a time when smartwatches were just starting to appear, a good 2 years before the iWatch. Apple could have marketed this as their new iWatch and beaten everyone to the market but screwed that up as well.