But Then He Got Screwed
Of course, at the same time, Bill Gates was busy trying to become Bill Gates, and he eventually achieved that at Kildall's expense.
In 1980, IBM was getting ready to launch its first personal computer and needed an operating system to operate the shit out of it. They first knocked on Microsoft's door, but Microsoft wasn't really into the OS-making business at that point, so they directed the IBM suits to Gary Kildall's company. However, as nerd lore has it, Gary picked that day to go flying (he was an amateur pilot), blowing off IBM and his chance at history.
Let he who hasn't blown off a corporate giant to go flying cast the first stone.
Accounts differ on whether Kildall met the IBM suits that day or not, but either way, the company went back to Microsoft, totally forgetting the whole "We don't make OSes here" part. Not one to miss out on an opportunity, Bill Gates turned to local programmer Tim Paterson, who had built a CP/M clone he called QDOS (for "Quick and Dirty Operating System"), bought it for a paltry 50 grand, then turned around and sold it to IBM under the name PC-DOS.
The term "user-friendly" meant something very different back then.
PC-DOS, later renamed MS-DOS, was included in every computer IBM made, and, long story short, that's why roughly 90 percent of you are using Microsoft Windows right now.
Today, Kildall's name is barely known, while Bill Gates will be a household name in the fucking 25th century. Most of Kildall's innovations ended up being credited to other people -- and he can't even defend himself, having died in 1994 after falling down in a tavern, which pretty much just seems like his luck.
Today's lesson is, if you're an inventor, wear a freaking helmet.
They are the geniuses behind the curtain.
Disney designer Joe Rohde, the Segway’s Dean Kamen, tech guru Scott Douglas Redmond, electrical pioneer Nikola Tesla, Frog Design’s Hartmut Esslinger, WELL founder Larry Brilliant, Robert Moog of synthesizer fame, Ray Kurzweil, a pioneer of speech-recognition technology, and many other innovation leaders all have novel cognitive architectures behind their genius.
They are fiercely competitive with one another, and each of them can see and understand more of the world than most people on Earth. They can see the future and build it in ways that we poor “simple-minded” people have a hard time comprehending. Each has an expanded intellect, but that ability has come at a price: in many cases, they have turned out to be twice-exceptional (“2E”) gifted impresarios who can’t see numbers the way the rest of us do.
Does trying to calculate a tip make you break out in a cold sweat? You’re definitely not alone. Math can be intimidating, to the point where even some of the Earth’s most brilliant scientific minds have had trouble crunching numbers. Let’s take a look at some of these amazing geniuses who build the future but see numbers differently than you or I do:
Michael Faraday: It’s hard to say which is more amazing: Faraday’s discoveries or his life’s story. Against all odds, this son of a poor blacksmith overcame class prejudice to become Britain’s preeminent scientist and, in many ways, the father of modernity itself. If you’ve ever pushed an “on” button, you’re in his debt. Faraday built the first electric motor—along with the first electric generator. He also invented the rubber balloon, laid the groundwork for today’s refrigeration technology, and helped illuminate the mysterious world of electromagnetism.
Yet, despite all this, Faraday’s upbringing never stopped haunting him. Like most impoverished boys, he’d received little formal education, and his math skills left a lot to be desired. In 1846, he boldly proposed that visible light is a form of electromagnetic radiation. But because he couldn’t back up the idea with mathematics, his colleagues ignored it. Enter James Clerk Maxwell (1831-1879). Believing the older scientist’s hypothesis, this Scottish physicist and mathematician used ingenious equations to finally prove Faraday right eighteen years later.
Charles Darwin: Darwin came down with some serious math envy. As a college student, he loathed the subject. “I attempted mathematics,” reads Darwin’s autobiography, “… but I got on very slowly.” The affluent young naturalist went so far as to invite a tutor to join him at his summer home in 1828. After a few frustrating weeks, Darwin dismissed the man.
“The work was repugnant to me,” he wrote, “chiefly from my not being able to see any meaning in the early steps in algebra. This impatience was very foolish, and in after years I have deeply regretted that I did not proceed far enough at least to understand something of the great leading principles of mathematics, for men thus endowed seem to have an extra sense.”
Alexander Graham Bell: In high school, the Scottish-born inventor of the telephone had a love-hate relationship with math. According to biographer Robert V. Bruce, Bell “enjoyed the intellectual exercise” of this subject, but was “bored and hence careless in working out the final answer once he learned the method.” His grades suffered accordingly. Bell’s mathematical aptitude never improved and, for a scientist, it would remain sub-par until the day he died.
Thomas Edison: “I can always hire a mathematician,” Edison once remarked, “[but] they can’t hire me.” Like all successful entrepreneurs, he was keenly aware of his strengths and weaknesses. As a boy, Edison trudged through Isaac Newton’s Philosophiae Naturalis Principia Mathematica (“Mathematical Principles of Natural Philosophy”). In his own words, the book left him with nothing but “a distaste for mathematics from which I never recovered.”
Higher math was a topic about which Edison knew almost nothing. So, after co-founding the General Electric Company, he brought German-born mathematician Charles Proteus Steinmetz into the fold. A numerical genius, Steinmetz oversaw many of G.E.’s technical underpinnings. Previously, Edison had recruited another mathematician—Bay Stater Francis Upton—to make calculations that could help him carry out various lab experiments. Together, they worked on such gadgets as the incandescent lamp and the watt-hour meter before parting ways in 1911.
Jack Horner: Horner cameoed in the third highest-grossing movie of all time. Over the past quarter century, he’s served as a scientific consultant for all four Jurassic Park films and was just rewarded with a brief on-screen appearance during one of Jurassic World’s raptor scenes. Back in the 1970s, Horner found the Western Hemisphere’s first known dinosaur eggs. A legendary paleontologist, he’s forever changed our understanding of how these incredible animals grew up and raised their young.
Horner’s success must have shocked his childhood teachers. The Montana native did poorly in school, which he found “extremely difficult because my progress in reading, writing, and mathematics was excruciatingly slow.” Teenage Horner flunked high school algebra, much to his math-savvy father’s disappointment. Horner went on to flunk out of college seven times and never graduated with a formal degree—which meant the jobs in the field he was most passionate about were closed to him. (Horner, who worked a series of odd jobs as a young man, eventually began writing “to every museum in the English-speaking world asking if they had any jobs open for anyone ranging from a technician to a director.” Clearly, it paid off.)
His educational woes remained a mystery until 1979, when Horner was diagnosed with dyslexia. “To this day, I struggle with the side-effects,” he says. “Self-paced learning is a strategy that helps me cope. Audio books are also a very helpful technology.”
E.O. Wilson: Apart from being the world’s top authority on ants, Wilson’s a first-rate science popularizer. He’s written dozens of bestsellers about everything from evolution and biology to philosophy and conservation. One of his offerings—2013’s Letters to a Young Scientist—reveals a tumultuous personal history with math.
The product of “relatively poor Southern schools,” Wilson admits that he “didn’t take algebra until my freshman year at the University of Alabama … I finally got around to calculus as a 32-year-old tenured professor at Harvard, where I sat uncomfortably in classes with undergraduate students only a bit more than half my age. A couple of them were students in a course on evolutionary biology I was teaching. I swallowed my pride and learned calculus.” While playing catch-up, he was “never more than a C student.”
For numerophobic science majors, he offers this tip: “The longer you wait to become at least semiliterate in math, the harder the language of mathematics will be to master … But it can be done, and at any age.”
Scott Douglas Redmond, an inventor of internet media distribution and energy storage technologies who holds numerous social media patents, created his own form of visual math, along with calculus and physics calculators, to give himself an advantage in his product designs. He teaches his visual math system to learning groups.
Douglas Engelbart was the human-computer interaction designer who invented the mouse.
Doug Engelbart is most celebrated for his role in inventing the mouse at the Stanford Research Institute. At a time when many people are turning to track pads and touch screens, the mouse remains perhaps the most commonly used peripheral of the past three decades.
But the mouse was a minor piece of Engelbart’s larger project, the oN-Line System. The unveiling of the NLS at the 1968 Fall Joint Computer Conference in San Francisco has been called “the mother of all demos” by some, because it packed video conferencing, networked collaboration, the mouse, hyperlinks and text editing into one presentation. These are now core technologies that make up what we think of as modern computing.
While the mouse proved to be a big hit with most, there was one man who questioned Engelbart’s design — specifically, how many buttons the mouse should have.
“We tried as many as five. We settled on three. That's all we could fit. Now the three-button mouse has become standard, except for the Mac. Steve Jobs insisted on only one button. We haven't spoken much since then,” Engelbart told Wired magazine in 2008. Engelbart had his own way of learning and describing complex math systems.
Engelbart’s mouse was too ahead of its time for him to profit from his idea. His patent expired in 1987, and he never received any royalties from it, according to the BBC.
After his famous demo in 1968, Engelbart remained at the Stanford Research Institute till 1977, when NLS and the Augmentation Research Center (ARC) were sold to a company that was ultimately acquired by McDonnell Douglas. He retired from McDonnell Douglas in 1989 and formed the nonprofit Bootstrap Institute, now known as the Douglas Engelbart Institute, an organization dedicated to promoting a collective approach to problem-solving.
Ray Kurzweil was the principal inventor of the first charge-coupled device flatbed scanner, the first omni-font optical character recognition system, the first print-to-speech reading machine for the blind, the first commercial text-to-speech synthesizer, the Kurzweil K250 music synthesizer capable of simulating the sound of the grand piano and other orchestral instruments, and the first commercially marketed large-vocabulary speech recognition system.
Kurzweil received the 1999 National Medal of Technology and Innovation, the United States' highest honor in technology, from President Clinton in a White House ceremony. He was the recipient of the $500,000 Lemelson-MIT Prize for 2001, the world's largest prize for innovation. And in 2002 he was inducted into the National Inventors Hall of Fame, established by the U.S. Patent Office. He has received twenty-one honorary doctorates and honors from three U.S. presidents. Kurzweil has been described as a "restless genius" by The Wall Street Journal, and his reading software for people with dyslexia is highly regarded.
Their novel methods of design, development, and deployment are delivering some of the most extraordinary internet and media products that billions of people now use.