25+ Inventions That Changed the World

The secret stories of inventions that changed the world.
Odin Odin

A great invention is basically the solution to a really difficult puzzle. And human beings, like crows, love a good puzzle. When we see something in the world that we don’t understand, we try to figure it out. When we come up against challenges, we search for the clues to their solution, probing ever deeper in our relentless drive to understand and explore. And those answers end up being some of the greatest inventions of all time.

I’ve always been fascinated with the history of inventions and inventing, so this list was a chance for me to dive back into some of my long-time favorites, as well as a few new mysteries that I’d long been wanting to explore.

If you think these inventions are interesting, check out the 16 most unbelievable historical military facts.

Nails

It’s hard to imagine something more mundane than a nail, right? It’s a basic tool, one of the most fundamental parts of the building process. It has entered our societal consciousness, and even our language, to such a degree that it’s hardly something anyone could be expected to think about. However, the nail has a great history, and its invention was a game-changer for Homo sapiens.

Nailing down the timeline

Archaeologists have found nails that date back to around 3,400 B.C.E., though it's likely that nails were around long before even that. Those early nails were bronze creations, made from an alloy of copper and tin used by ancient civilizations.

For thousands of years, nails were made by professionals. Metalworkers would pass slender metal shafts to these professional nailers, who would hammer them into their final shapes.

The invention of the slitting mill changed that, however: this type of watermill used water power to automate slicing the iron into the nail's initial shape. Nailers were then relegated to the nail's final shaping, using a hammer to create the point and head. Eventually, even that artistry vanished, replaced by the fully mechanized industry we know today.

Photography

Taking photos has become a ubiquitous part of modern life — it’s everywhere and accessible to anyone with even a low-end smartphone. But there was a time when the exposure of a single photo (creating the photo by allowing light to strike a special film or plate) could take many hours.

Photography made it easy to flash back in time


The first successful form of photography was known as daguerreotyping, invented in the 1830s by Louis-Jacques-Mandé Daguerre (from whose name the term was derived) and Nicéphore Niépce.

By exposing a copper plate coated with silver iodide (a highly photosensitive substance) to light through a camera and then utilizing a vaporized concoction of salt and mercury, an image imprinted on the plate would be made permanent. Though other photography methods later supplanted this technique, it remains the beginning of everything visual we take for granted today.

Vacuum Tube
A Toshiba vacuum tube radio from 1955. (Wikimedia)

In 1904 the famous physicist and electrical engineer John Ambrose Fleming finished work on an idea that would change history: the vacuum tube. Nowadays, you’re unlikely to find one outside of an antique, though vacuum tubes are still in use within some high-end audio equipment.

Totally tubular!


The vacuum tube is exactly what it sounds like: a sealed glass tube from which nearly all of the gas has been removed (hence the vacuum). Inside the tube are electrodes, which control the flow of electrons.

Before the creation of the vacuum tube, computing devices had to rely on mechanical switching, which severely limited the speed at which a computer could function. The vacuum tube allowed on/off signals to be switched purely by the flow of electrons, speeding up the process and allowing more to be done with less space.

However, vacuum tubes are large and delicate compared with transistors, which began to replace them in the 1950s as computer technology strove toward ever smaller machines.

Saxophone

Few instruments in history have been invented by a single individual, but the saxophone, one of the best-known and best-loved instruments in use today, is one of them.

A “sax”cessful development


Adolphe Sax (1814-1894) conceived the saxophone sometime in the 1840s, and it is from him that this beloved instrument gets its name. Sax was a professional designer of musical instruments living in Belgium who possessed a great knowledge of wind instruments and could play a wide variety of them. Enraptured by the sounds of the woodwinds as well as the somber power of brass, he sought to combine the two, and the saxophone was born.

Further evolution of the saxophone took place through the 1930s and into the late 1940s, with changes to the base design or entirely new layouts being tried.

Iconoscope

Vladimir Zworykin is known as the “father of television,” and for good reason. This Russian-born American inventor created the first fully electronic scanning device (the iconoscope) and the first fully electronic receiver (the kinescope).

Scoping out the future of TV


The iconoscope contained a photosensitive plate divided into many tiny sections, which together would capture a single image. Light focused on the plate charged each section according to how bright it was, and a beam of electrons scanning across the plate read off those charges one by one. This allowed a direct electrical derivation of the original image.

Zworykin never liked his “father of television” moniker and never accepted it, stating that he shared the credit for creating television technology with a host of others. Regardless, his work helped to pioneer the process of sending moving pictures across the world in an accessible way and led to some of the greatest social changes in history. Best of all, though he was born in 1889, Zworykin lived until 1982, long enough to see the ripening fruits of his labor with his own eyes.

Integrated Circuit

The evolution of computing could be summed up as the ability to do more with less space, and in this regard the IC, or integrated circuit, is one of the greatest inventions of them all. It’s something every schoolchild should learn about, because it’s the single most fundamental piece of modern technology, underlying just about everything they’re going to interact with for the rest of their lives.

Chips off the old block


An IC is an assemblage of miniaturized electronic components fabricated as a single unit. Everything is built on a sliver-thin semiconductor (a crystalline plate with properties between those of an electrical conductor and an insulator), so that a single IC contains all the transistors, diodes, capacitors, and resistors, along with the pathways between them, built in at a microscopic scale.

They replaced vacuum tubes quickly: passing electrical signals through the crystalline semiconductor creates the same switching effect as a vacuum tube, but with great space savings and increased durability.

The IC was not the product of any one person. The underlying principles were first demonstrated by Bell Laboratories’ William B. Shockley and his team, who invented the transistor in 1947, and in 1958-59 two researchers working independently, Jack Kilby and Robert Noyce, came up with the first methods for integrating entire circuits onto a single chip.

Velcro

The relationship of Homo sapiens to its environment is utterly fundamental; humanity’s long tradition of drawing fascinating inventions out of interactions with the local environment is one of our most defining traits.

So it was one morning in 1941, when a Swiss man named Georges de Mestral noticed seed burrs from the burdock plant stuck to his dog’s coat. Curious and already possessed by the inventor’s itch, de Mestral inspected the burr tips beneath a microscope. Instead of the straight spines they appeared to be to the naked eye, he observed tiny hooks that allowed the burr to catch and hold onto whatever passed with incredible strength.

Velcro hooked Mestral’s imagination


It took many more years of innovation, research, trial, and error before de Mestral finally settled upon the design that would become a basic staple of modern life. Once he settled on synthetic fabric (finding cotton too easily damaged for repeated joining and separating), de Mestral also designed a special loom that could create the hook and loop fabric he needed. After more than a decade, he finally finished the prototype design and patented his work. By 1957 his first plant began mass-producing Velcro.

While it ventured into the world as a fashion aid and eventually a replacement for shoelaces, the incredible possibilities opened up by this new technology soon brought it everywhere. Now, daily life is hardly complete without the stuff: it can be found extensively in the medical industry, in military gear, and in the gear worn by NASA astronauts (who helped to pioneer its use after realizing how much easier and more effective it proved to be than ties). Quite impressively, Velcro was even used in an early artificial heart design to hold the device's left and right sides together.

play-doh

Accidental inventions are a hallmark of humanity, so it’s little wonder that one of the most beloved childhood toys found its way into this world on the back of misfortune and surprise.

Play-Doh-ing better than expected


By the first third of the 20th century, the company Kutol Products had swept the competition aside in its bid to be the world’s largest producer of… wallpaper cleaning solutions. Back in those days, many households still used coal to provide heat directly, and the sooty byproduct made household cleaning a nightmare. But as cleaner energy technologies and infrastructure took hold, the problem dissipated, and the need to keep your wallpaper free of soot gradually slipped away. A better world for everyone! Everyone but Kutol Products, that is.

With sales declining, a miraculous thing occurred: Kay Zufall, the sister-in-law of a Kutol Products employee, called him up, told him how well the pliable cleaner worked as modeling clay for children, and urged him to look into that arena of play. The rest is history. The non-toxic formula proved an incredible worldwide hit, remaining a ubiquitous part of childhood to this very day.

Post-it Notes

In 1968, 3M employee Spencer Silver was tasked with the creation of a new type of adhesive, something stronger than what was currently available. His invention of sticky “microspheres” proved ineffectual, however, and the project was largely forgotten, with potential applications running into problems either with the strength of the adhesive or the resulting build-up of debris on the sticky material over time.

An invention that stuck


It was not until 1974 that another 3M employee, Art Fry, realized the sticky microsphere technology would be perfect for attaching a bookmark to the delicate pages of his hymnal safely, without leaving any damage after being removed. What followed was a revolution in note-taking. After a powerful marketing campaign built around the self-identifying name “Post-it Note,” the new invention stuck for good.

Since their invention, Post-Its have been everywhere, from the business and academic worlds to the forefront of social change (as seen with The Lennon Wall during the 2014 Hong Kong protests).

Interesting note: An American inventor named Alan Amron has claimed to be the original inventor of the Post-it Note concept and has fought legal battles with 3M over the rights to it, on the grounds that he presented the idea in 1974. Regardless, Silver and Fry remain at the core of the story for the original work and experimental approach.

Raspberry Pi

In 2006 Eben Upton connected with several educators, computer enthusiasts, and academics in the hope of creating a new type of computer, one that could be used to inspire children to better understand and explore computer technology. Drawing inspiration from the 1981 BBC Micro educational computer, the team laid out the first designs and began detailed planning. It would take another five years for the project to move into early production, but by 2011 the earliest models were being assembled and tested.

A pint-sized invention everyone is pining for


Originally conceived of as an educational aid, the Raspberry Pi has since taken off to incredible new heights, finding use in everything from retro gaming to high-end photography. Commercial applications have also made use of this tiny but powerful computer. Having access to a low-cost, affordable-to-operate, and surprisingly capable personal computer is groundbreaking in terms of opening the doorway to Internet access for disenfranchised people around the globe. But the Pi’s excellence goes even further: in 2021 Raspberry Pi computers became a major tool in the fight against COVID-19, with the Raspberry Pi Zero finding new use in ventilator systems in countries where health infrastructure was being pushed to its breaking point. No matter where you look, the Raspberry Pi is more than just the best-selling British computer ever; it’s also one of the biggest game-changers in the history of personal computing.
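
To give a flavor of why the Pi works so well as a teaching tool, here’s the kind of first program many beginners run on one: blinking an LED from Python. This is a minimal sketch that assumes an LED (with a resistor) wired to GPIO pin 17 and the gpiozero library that ships with Raspberry Pi OS; the pin choice and wiring are illustrative, not a requirement.

```python
# Minimal Raspberry Pi beginner sketch: blink an LED once per second.
# Assumes an LED and resistor wired to GPIO 17 (BCM numbering) -- illustrative only.
from time import sleep

from gpiozero import LED

led = LED(17)

while True:
    led.on()   # drive the pin high, lighting the LED
    sleep(1)
    led.off()  # drive the pin low
    sleep(1)
```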

iPod

The history of the iPod has to start with its name. After all, the “i” prefix has become nearly synonymous with the tech industry; surely it has to mean something, right? Nope. Apple has had a long history of using the “i” prefix in its designs, with the first release to bear it being the 1998 iMac. According to Ken Segall, the team leader who came up with the name for the iMac, it simply stood for “internet,” a means of showcasing the Mac’s ability to connect to that latest and sometimes intimidating world of 1990s networking.

ipso-facto: I see “i” everywhere


The “Pod” part of the name comes from 2001: A Space Odyssey, an association the copywriter Vinnie Chieco came up with when hired by Apple to work on branding for some of its latest technology. Thinking about the device’s stylistic design, he remembered the famous line from 2001: “Open the pod bay doors, Hal.” Since the first iPods were released in 2001, it sort of fits perfectly.

But what about the iPod itself? There were plenty of other MP3 player designs around before the iPod was conceived, but none of them could match its design or its simple interface. At the time of the iPod’s initial release, Apple was a known company but hardly a giant; the success of this sleek MP3 player helped drive an entire ecosystem of Apple products, starting with the iPod and its accompanying music-storage software and eventually spinning off into greater and greater leaps of design ingenuity. While everyone knows Apple now, it’s fascinating to remember that the company largely became a powerhouse on the back of this one product, way back in the ancient year of 2001.

Chocolate Chip Cookies

It’s hard to believe it, but chocolate chip cookies were not always a thing.

I “dough”n’t even want to imagine a world without these

The invention is surprisingly modern, with its roots dating back to a woman named Ruth Graves Wakefield in 1938. Wakefield owned the popular Toll House Inn and decided to try something different to mix things up a bit for her guests. Instead of going for her usual staple of butterscotch nut cookies, she chopped up bits of Nestle semi-sweet chocolate bar into her cookies and ended up with a world-altering phenomenon.

The recipe spread far and wide during WWII, when soldiers from Massachusetts shared the cookies they received in care packages from home, and soon soldiers from all over the country were asking their families to send chocolate chip cookies of their own. Wakefield would go on to give the recipe for her invention to Nestle, who rewarded her with a lifetime supply of chocolate.

Gunpowder

Way back in the year 142 AD, the Chinese alchemist Wei Boyang, sometimes called the father of alchemy, commented on a mixture of three powders that would “fly and dance” with great violence. And yet the earliest efforts to probe the mysteries of this concoction were likely aimed at discovering an elixir capable of prolonging life or granting immortality. Sadly, the eventual result would prove far more adept at taking life than giving it.

The mix sparked the imaginations of these early experimenters


Over the next several hundred years, the formulas and the reasons for exploring gunpowder continued to evolve, still rooted in attempts at healing and at creating gold. But the earliest known written formula for gunpowder dates to 1044, in a Chinese military manual that described several weapons made using the substance, which indicates that it had already been in use as a weapon for some time — if primarily as an exotic and uncommon one. In fact, in its earliest incarnations gunpowder was often not explosive, merely highly flammable, leading to its use in “fire arrows”: arrows set alight and designed to spread fire as widely as possible.

By the late 1100s, gunpowder explosives were being used to greater effect within iron casings, creating an iron bomb capable of terrible destructive potential. Further developments during the subsequent two centuries took the melding of metalwork and gunpowder to new levels, with the invention of hand cannons capable of delivering shot (the first true “bullets”). Indeed, such weapons would be used by the Mongols in the 1200s during their invasion and conquest of China.

The spread of gunpowder would continue for the next several hundred years, first through the Middle East and then into Europe where it was primed to usher in a revolution in military technology.

Kevlar

In 1965 the chemist Stephanie Kwolek was attempting to develop a new type of strong and rigid fiber that could replace steel wires that were, at the time, commonly used to reinforce car tires.

Kevlar: The strand in demand


An unexpected reaction during one of her many routine experiments resulted in a new type of fiber, something stronger and stiffer than anything anyone had ever seen before. Kevlar.

Kevlar is formed of long polymer chains held together by strong hydrogen bonds, with the molecules aligned along the length of the fiber. The result is a material roughly five times stronger than steel by weight, while remaining exceptionally light and capable of withstanding extremely high temperatures.

Kevlar has since become renowned for its ability to save lives through its use in so-called “bulletproof vests,” but its uses reach farther still, finding a place in NASA technology for space exploration and in the equipment that keeps firefighters safe in blazing conditions. Another great example of how accidents have changed the world.

phono

In 1877, Thomas Edison was experimenting with a method for recording telegraph messages when he realized that by capturing sound vibrations from the air with a diaphragm, he could cause a mechanism connected to a stylus to make indentations on a soft material, generally tinfoil. The process could then simply be reversed to reproduce the sound just captured: the vibrations of the stylus running through the new grooves were transmitted back out through the diaphragm as audible sound.

A sound invention


These original phonographs were not suitable for extended use, however, due to the limitations of the tinfoil medium: the foil would be destroyed by subsequent playbacks, and mounting it was difficult. The basic design, a cylinder (wax at first, which proved a more durable alternative to tinfoil) on which the sounds were inscribed, remained in use until the mid 20th century. The switch from cylinders to discs actually had nothing to do with sound quality (in fact, the sound on a cylinder remains constant across the whole playback, while sound on a disc is distorted between the inner and outer portions due to a difference in groove velocity). The takeover by the disc was due entirely to the greater ease with which it could be mass-produced using a stamping technique, whereas the casting required for the cylinder design proved more costly and time-consuming.
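
That velocity difference is easy to see with a little arithmetic. The sketch below uses illustrative numbers (a 78 rpm disc and groove radii of roughly 14 cm at the edge and 6 cm near the label) to show how much faster the groove passes the stylus at the outer edge than near the center, whereas a cylinder's constant radius keeps that speed uniform:

```python
import math

def groove_speed_cm_per_s(radius_cm: float, rpm: float) -> float:
    """Linear speed of the groove passing the stylus: v = 2 * pi * r * (rpm / 60)."""
    return 2 * math.pi * radius_cm * rpm / 60

RPM = 78  # a common early disc speed (illustrative)

outer = groove_speed_cm_per_s(radius_cm=14, rpm=RPM)  # near a 12-inch disc's edge
inner = groove_speed_cm_per_s(radius_cm=6, rpm=RPM)   # near the label

print(f"outer groove: {outer:.1f} cm/s, inner groove: {inner:.1f} cm/s")
# The outer groove passes the stylus more than twice as fast as the inner one,
# so the same second of sound gets over twice as much groove length there.
# A cylinder's radius never changes, so its groove speed stays constant.
```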

These days, the “grooves in tin foil” are reproduced digitally and almost anyone has access to millions of songs at the push of a button, but there was a time not that long ago when the reproduction of sound was at best a gimmick for the wealthy, at worst an impossibility altogether.

EnChroma Glasses

Don McPherson started out designing glasses for doctors performing laser surgery; the lenses blocked harmful light and helped the surgeons distinguish blood from tissue. Effective and stylish, some of these glasses even started vanishing from operating rooms. An impressive enough invention on its own, it got its boost in a whole new direction quite by accident.

A “cone”venient encounter


A friend of McPherson’s borrowed his glasses one day and found that he could suddenly see colors that had previously been invisible to him. Based on that, McPherson realized that the rare earth iron embedded in the lenses was somehow affecting the way his friend's eyes perceived light.

There are three types of color-sensing cells in the eye, commonly referred to as “cones,” each containing a photopigment sensitive to blue, green, or red light. The green and red photopigments overlap to some degree in most people, but a problem emerges when they overlap too much and photons of light entering the eye aren’t landing on the right color receptor. For most people who have colorblindness, this is the source of the problem.

Years of research and design later, McPherson’s company EnChroma now specializes in bringing color to the colorblind, able to treat around 80% of all cases that come in through their door.

Convection Oven

As early as 1914 the first designs for this marvelous piece of cooking technology already existed, but it would take a savvy and eclectic inventor and the start of a second world war to see them rise in popularity. Still, once they did, convection oven technology quickly became a staple for chefs and bakers everywhere.

An oven with a fan? What a load of hot air.


It was a man named William Maxson who, upon realizing that soldiers on their way to Europe were eating cold sandwiches and field rations, pioneered the idea of a small convection oven that could heat and cook complete meals in record time. A fan drives hot air over the surface of the food, creating a browning effect, while the heat is simultaneously distributed evenly around the whole item being cooked (rather than coming from a single direction). This allowed multiple meals to be cooked on racks, too, without switching them around, because the hot air was carried everywhere throughout the oven.

Maxson named this device the Maxson Whirlwind and was set to make a big break into the post-war civilian aviation industry, as well as the home appliance industry, when he tragically passed away. Though he left behind many fascinating inventions, from a military gun mount to an early version of a mechanical autopilot, it would be his convection oven technology that ultimately survived the test of time, normalizing a technology that most of us these days take for granted. Other companies took up the task of designing convection ovens, and in 2008 Philips introduced what it called an “AirFryer,” which continued in Maxson’s footsteps as the next iteration of portable and swift home cooking.

OraQuick

The human immunodeficiency virus, or HIV, is one of the most devastating diseases ever to strike humanity and has resulted in the deaths of an incredible number of people, many of them from historically disenfranchised communities, especially the LGBTQ+ community. Though new methods of treating HIV have been developed, no cure or vaccine has yet succeeded in eradicating it as a concern for millions of people worldwide. Worse, if left undetected, HIV can continue to damage the immune system and lead to AIDS (acquired immunodeficiency syndrome).

OraQuick: Saving lives is no joke


It makes sense, then, that a method of quick, easy, and confidential testing would be a dramatically valuable tool in the fight against this disease. Being able to test for HIV with an over-the-counter purchase rather than a visit to the doctor’s office could, for many, be the difference between getting tested and not getting tested at all.

There are two such products currently available, the HIV-1 Test System and OraQuick. The former requires a self-taken blood sample to be sent to a laboratory, which then responds with confidential results within a week. OraQuick measures the HIV antibodies present in oral fluid (not saliva, but fluid drawn from along the gums) and can produce its result within 40 minutes. It still requires follow-up with a doctor, because of the severity of the condition and the potential for a misdiagnosis from the test, but it offers a chance for people to get a head start on fighting the disease.

Vaccines

Considering the unprecedented situation that COVID-19 placed the world in, finding a vaccine has been crucial on all fronts. While we’ve had vaccines for other respiratory viruses, like influenza, the techniques that are being implemented for the next wave of COVID-19 vaccines are “next-gen” in design, capable of more specific targeting and adaptability to keep up with viral mutations. The existing vaccines and the new ones still being developed are truly a marvelous example of the technological ingenuity that arises from a global scientific community coming together to work toward a singular common cause.

There have been many vaccines developed and put into testing as of early 2021, with several being cleared for wide public use and capable of stemming the tide of further spread.

RNA vaccines


RNA vaccines are amazing because they contain RNA molecules that teach the body how to fight off the infection. These were the first wave of COVID-19 vaccines to be authorized for use in the US.

Adenovirus vector vaccines


Inactivated virus vaccines


Subunit vaccines


Regardless of the type of vaccine, the technology behind them is overwhelmingly brilliant. The lives that will be saved through this technology are certain to change the world for generations to come.

Barcodes

It’s a point of great curiosity how similar inventions can be created by different people who have never met, and how different approaches to the same problem can build upon each other to create something better than any of the individual parts. Both of these processes can be found at work in the history of the barcode.

It’s hard to codify the impact of this one


It’s such a common part of modern life that it’s easy to forget that the barcode only took off in the 1980s. The whole thing started when an inventor named Joe Woodland sketched Morse code in the sand of a beach and realized that the dots and dashes could be extended into wide or narrow lines. That, he thought, might be read by a machine.
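
As a toy illustration of Woodland’s insight (and not any real barcode standard such as UPC), here’s a sketch that stretches Morse-style dots and dashes into narrow and wide bars and then “scans” them back into letters:

```python
# Toy illustration of Woodland's beach sketch: stretch Morse-style dots and
# dashes into narrow and wide bars, then "scan" them back. Not a real barcode
# standard -- purely to show the idea of machine-readable line widths.

MORSE = {"A": ".-", "B": "-...", "C": "-.-."}          # tiny sample alphabet
REVERSE = {v: k for k, v in MORSE.items()}

def encode(text: str) -> str:
    """Dots become narrow bars '|', dashes become wide bars '|||'; bars are
    separated by a space, letters by a double space."""
    letters = []
    for ch in text:
        bars = " ".join("|" if sym == "." else "|||" for sym in MORSE[ch])
        letters.append(bars)
    return "  ".join(letters)

def decode(bars: str) -> str:
    """Read bar widths back into dots and dashes, then into letters."""
    out = []
    for letter in bars.split("  "):
        symbols = "".join("." if bar == "|" else "-" for bar in letter.split(" "))
        out.append(REVERSE[symbols])
    return "".join(out)

printed = encode("CAB")
print(printed)          # ||| | ||| |  | |||  ||| | | |
print(decode(printed))  # CAB
```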

The original impetus came from a desperate supermarket manager looking for a way to improve store efficiency. Bernard Silver, a friend of Woodland, told him about this, and their search for a perfect system was born. Of course, this was back in the late 1940s, and since they lacked a sufficiently powerful computer, as well as a bright light small enough for their purposes, the project floundered. They filed their patent and went about their lives.

Then, in the late 1960s, the supermarket chain Kroger and the then-tech powerhouse RCA teamed up in search of something that could simplify the checkout process. They hit upon the old patent by Woodland and Silver, and the future was born.

In the end, it was IBM, not RCA, that came to dominate the way we view barcodes. RCA had a functioning circular barcode, but IBM entered the race at the last moment and managed to convince more stores to adopt its fresh, rectangular design. It took another couple of decades for the technology to be popularized, but it eventually became one of nearly infinite uses and potential.

Brandy

Part medieval French medicine, part American entrepreneurial dream, the history of brandy is fascinating. Known to the French as “l’eau de vie” (the water of life), it was originally a powerful curative and instrument of sanitation. Then, in the late 1790s, it took off throughout America thanks to the entrepreneurial spirit of one American Founder: George Washington, who, at the urging of his experienced Scottish farm manager, went into the booze-making business.

Spiritus Sanctus


Originally, wine was distilled to preserve it for long travel, but the process had other curious effects as well. With the water content removed, various aromatic elements remained behind in the resulting distillation, providing a unique sensory experience. Later, the additional technique of storing the distilled wine in casks was found to provide even more flavor, resulting in a drink with a wonderfully wide range of taste and smell. Further, because of the distillation process, brandy tends to carry a truly regional character, its taste changing dramatically based on where its grapes were grown, the water and mineral content, and other local features.

It’s hard to imagine a world without brandy, which would also be a world without a whole range of delicious recipes, from glazes to onion soup. Another gift from the accidental byproduct of a different process, brandy was a medicinal life-saver for hundreds of years before it became a party-saver for any occasion.

Internet

During my undergraduate degree, I asked Professor Noam Chomsky if he thought that the Internet might prove to be an inherently anarchist community, to which he replied with the following:

“It was designed that way and functioned that way as long as it was within the public sector (first Pentagon, then National Science Foundation) (much of it in the lab at MIT where I was working). Since it was privatized 20 years ago, that has been less and less true, in many ways.”

Whatever the temporally local political implications of the Internet, however, there can be absolutely no doubt that its creation is perhaps the greatest leap forward in the history of human communication. In a way, the history of the Internet is the history of human communications, a steady yet convoluted march toward greater degrees of communication and interconnection across the millennia. As the historian Yuval Noah Harari points out in his groundbreaking book Sapiens: A Brief History of Humankind, it is our capacity to imagine things that aren’t real and share those things with our fellows that allowed us to expand in ways that other varieties of human were apparently less capable of (as evidenced by their eventual destruction at our hands).

Everything from verbalized speech, to basic notation, to advanced writing and the proliferation of language through mass printing… every piece of the puzzle of human communication comes together within the Internet as a massively powerful culminating entity. As Chomsky pointed out, this is not an invention free of co-option by powerful, selfish interests, but I maintain that the wonders it opens to us still have the potential to burn bright enough to outshine the dark ends to which it has been used. I think time will tell in which direction the house of cards falls.

A network of inventions


As I've already suggested, the Internet is not the product of a single mind, but its true beginnings were quite specific and tied, sadly, to the Cold War.

After the Soviet Union beat America to space with Sputnik, the United States finally started emphasizing the need for science education (something largely ignored beforehand). It was this surge of concern over Soviet ingenuity that drove the creation of the National Aeronautics and Space Administration (NASA) and the Department of Defense’s Advanced Research Projects Agency (ARPA). The latter, ARPA, was especially concerned with the vulnerability of the U.S. communications grid to a Soviet attack. A means of communicating even if that grid were destroyed became the main focus of its research efforts.

Galactic network

M.I.T. proved to be the main field within which thinking about this project took place, with the initial concept described as a “galactic network” of computers capable of “talking” to one another. The first step proved to be the creation of “packet switching,” a technique of breaking information down into blocks that could all be sent separately through different connections and ultimately reassembled. This meant that no single line was required between computers; they could send data through a vast network.
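
Here’s a minimal sketch of the packet-switching idea described above: a message is chopped into numbered blocks that can travel independently, arrive in any order, and still be reassembled at the destination. The block size and format are illustrative assumptions, not ARPAnet’s actual packet format.

```python
import random

def packetize(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Split a message into numbered blocks ('packets') of at most `size` characters."""
    return [(i, message[i * size:(i + 1) * size])
            for i in range((len(message) + size - 1) // size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Put the blocks back in order by sequence number, however they arrived."""
    return "".join(chunk for _, chunk in sorted(packets))

message = "Packets can take different routes and still arrive intact."
packets = packetize(message)
random.shuffle(packets)          # simulate packets arriving out of order
assert reassemble(packets) == message
print(reassemble(packets))
```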

The first computers attached to the network were house-sized, but through the 1970s, technology’s advance began to gather speed. Soon UCLA, Stanford, and the University of Hawaii were all connected within ARPAnet, followed by University College London in England and the NORSAR research facility in Norway. But problems still existed; the increasing number of local networks made it ever more difficult to connect them all into the “galactic network” envisioned earlier at M.I.T. That’s where Vinton Cerf came in.

A computer scientist, Vinton Cerf, working with Robert Kahn, developed a way for all those global and local networks to communicate with each other: TCP/IP (“Transmission Control Protocol/Internet Protocol”). This is how all computers and computer networks recognize one another and communicate.

A protocol for everything


With TCP/IP solving the longstanding issue of communication beyond local networks, the Internet was born. This fledgling network connected scientists worldwide and allowed information to be sent from one computer to another anywhere. An incredible step forward, but it wouldn’t be until 1991 that the “World Wide Web,” as we know it today, would be born.

Thanks go to Tim Berners-Lee, an English computer programmer, who abstracted out of the Internet's pre-existing elements the first system that would allow for more than mere passive exploration: a linked database of pages that anyone could connect to, with instructions for how to create new servers and build websites. In fact, that original site is still hosted at CERN (where Berners-Lee worked at the time).

You can view it here.

This was the World Wide Web, a name that Berners-Lee and Robert Cailliau (another computer scientist instrumental in early Internet efforts) agreed on together. From there, it was a steady expansion and process of normalization for the “Web” to become a dominant force in our lives — quite literally in my lifetime: the first non-European Web-server was installed in Palo Alto in December 1991, the same year I was born.

The future of the net


Since then, the Internet and the World Wide Web have gone through many major advances and revisions, and will undoubtedly continue to do so in the coming decades. New methods of processing, storing, and sharing data are being explored. New infrastructure for Internet connectivity is being implemented everywhere, via cellular services, fiber-optic networks, and satellite systems like Elon Musk’s Starlink network.

While it’s difficult to say yet what the long-term social and philosophical implications of this incredible interconnected system will be, there can be little doubt that it’s one of the greatest inventions in history and proof of the power of collaborative effort applied to a common cause.

Nuclear Power

Inventions don’t take place in a vacuum; the history of inventions is also the history of human connection and interrelationship, the history of evolving ideas built one atop the other, inexorably linked. Atomic power is no different. The route to the sort of atomic power we are capable of considering today is buried in the history of the study of elements, of radiation, and of the proof of the existence of atoms. It’s a fascinating road that has bordered a deep gulf of potential self-destruction, as well as a fertile field of innovation and a nearly limitless provision of energy. It’s a story that continues today, with the theoretical next steps in the evolution of atomic power offering immense rewards and terrifying risks that must be weighed by a scientifically literate world populace capable of understanding the history that has led us to this point.

The evolution of nuclear power


Martin Heinrich Klaproth, a famed German chemist responsible for the discovery of many different elements, discovered the element uranium (which he named after the newly discovered planet Uranus) in 1789, the same year that the Constitution of the United States took effect. Uranium is a chemical element that naturally gives off tiny particles, or radiation. Radiation itself was discovered about a hundred years after uranium, in 1895, when a man named Wilhelm Röntgen passed an electrical current through a vacuum tube and produced X-rays. A year later, the physicist Henri Becquerel, using photographic plates, discovered the first evidence of radioactivity — a term later coined by Marie Curie, whose pioneering work on radiation led to her becoming the first woman to receive the Nobel Prize.

All of this background built to a major slew of explorations into radiation and atoms during the early decades of the 1900s, when several scientists discovered that the internal structure of an atom could change when it was bombarded by radiation. Through this process, new elements could be formed. Otto Hahn, a German scientist, discovered that it was even possible to split an atom into two smaller atoms by bombarding it with tiny particles called neutrons. Because the two new atoms together hold less energy than the original, a great deal of energy is released during this splitting (fission just means “splitting into parts”). Moreover, when fission occurs, more neutrons are released from inside the original atom along with the burst of energy, and these new neutrons can cause a chain reaction in nearby atoms, causing them to split and release energy as well.
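
The growth of such a chain reaction is roughly geometric: if each fission frees enough neutrons to trigger, on average, k further fissions, then each generation is k times larger than the last. The sketch below uses k = 2 purely for illustration; it’s a back-of-the-envelope picture, not real reactor physics.

```python
# Illustrative sketch of a fission chain reaction's growth, not reactor physics:
# if each split atom triggers k further splits on average, generation n has k**n fissions.

def chain_reaction(k: float, generations: int) -> list[float]:
    """Number of fissions in each generation, starting from a single fission."""
    return [k ** n for n in range(generations)]

for n, fissions in enumerate(chain_reaction(k=2, generations=10)):
    print(f"generation {n}: {fissions:.0f} fissions")

# With k > 1 the reaction runs away (a bomb); a reactor uses control rods to
# absorb neutrons and hold k at about 1, releasing heat at a steady rate.
```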

A reaction to war

The expansion of World War II further pushed the development of nuclear understanding. Russia, Germany, Britain, and the United States were all rushing forward with programs aimed at turning this new source of massive energy into a better method for causing terrible destruction. A British special committee would go on to release two reports on nuclear options, one regarding the use of nuclear technology in bomb form, and one on its use as a new source of power. The first would go on to drive the creation of powerful atomic weapons, like those used to destroy Japanese cities near the end of the Second World War. The other report, however, explored another possibility.

Initially the main focus of United States scientific efforts after the British report on the subject, nuclear power was pushed to the back burner once the country became involved in the war. But the practical possibility of nuclear fission remained: by controlling the rate at which uranium (or some other fissionable element) underwent fission, massive amounts of heat could be created and siphoned off to power any number of machines.

The need for nuclear power reached a boiling point

After the war, interest was renewed in the use of nuclear fission for power generation, which had now been proved fully capable of producing heat, and with it steam and electricity. Renewed efforts at nuclear power research expanded quickly in both the United States and Russia, independently. In 1954 the USS Nautilus, the first-ever nuclear-powered submarine, was launched, followed closely in 1959 by the first nuclear-powered surface vessels of both US and Russian design. The first commercial nuclear plant went into operation in the United States in 1960.

The future of nuclear power


Nuclear power developments have stagnated since the 1980s, with few new reactors built around the world (and most of those built, or planned, residing in China). Increased concern over the dangers of reactor meltdowns, weaponization, harmful byproducts, and the inherently risky proposition of allowing private commercial entities to regulate such tremendous power themselves has led to widespread public distaste for the fission reactor. Despite figures like Bill Gates championing nuclear power options, the prospect of switching to new nuclear options remains in an uncertain area of public debate.

However, the next great push forward in nuclear development may be just around the corner in the form of the long-sought nuclear fusion (where power is generated from fusing atoms rather than splitting them apart). Should nuclear fusion be solved, a new era of safer, cleaner, and utterly renewable energy could finally be within our grasp.

Cotton Gin

The evolution of the so-called “Capitalocene” (the age of capitalism) is utterly linked to the advancement of mass-manufacturing processes. Of the early inventions that so totally transformed a production industry, few had a greater impact on the United States and the world than the cotton gin (a shortened form of “cotton engine”).

Cottoning on to the need for mass production


Handheld cotton gins had actually been in use in India since 500 AD, with advances made throughout the middle ages. But a mechanical cotton gin could provide results far faster and with much greater productivity than handheld designs.

Long before Whitney, the concept of the Indian cotton gin had been introduced to the United States, where it was primarily used for cleaning long-staple cotton (a higher-quality variety of the plant). But Georgia and other inland states had a problem: they could only grow short-staple cotton, which contained more seeds and was extremely difficult to process. Advancements to the cotton gin design were made, but these were exclusively for the processing of long-staple cotton… until a man named Eli Whitney invented something new.

The mechanical cotton gin that Whitney devised could reliably separate cotton seeds from the fibrous lint, and do so with only minimal technical skill required. Before the cotton gin, it would take one person up to ten hours to remove the seeds from just one pound of cotton. With the introduction of this mechanical invention, three people working such a machine could clean fifty pounds of cotton in a day. Those people were, invariably, slaves.
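
A quick back-of-the-envelope comparison using the figures above (and assuming a ten-hour working day, which is an illustrative assumption rather than a historical claim) shows the scale of the jump in output per worker:

```python
# Rough arithmetic from the figures in the text; the ten-hour day is assumed.
hand_rate = 1 / 10            # pounds of cleaned cotton per person-hour, by hand
gin_rate = 50 / (3 * 10)      # pounds per person-hour with Whitney's gin

print(f"by hand:  {hand_rate:.2f} lb per person-hour")
print(f"with gin: {gin_rate:.2f} lb per person-hour")
print(f"roughly a {gin_rate / hand_rate:.0f}x jump in output per worker")
```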

The human cost

As cotton production in the southern United States grew, its need for slave labor grew. It was on the backs of enslaved Americans, mostly those of African descent, that the southern plantations gained their wealth and power. But from this industry a massive economic surge occurred throughout the whole country as well, fueling the textile industry in the northern states (where the North’s own brand of brutal working conditions thrived in great factories that often employed women as low-wage workers). Without the cotton gin as devised by Whitney, this surge of slavery might not have been possible — such is the dangerous power of a single invention unleashed upon the world.

Steam Locomotive

The first locomotive powered by steam was invented in 1804 and forever changed the way human beings would relate to distance. Distant cities became reachable in days or even hours, and rail was affordable enough to allow even the poorer classes to travel extensively and find better prospects for work or better places to settle (previously a perilous and extremely difficult undertaking). Before the steam train, the fastest mode of overland transportation was the mail coach, pulled by a team of fresh horses, which could reach the breathtaking pace of 7 miles per hour. When steam locomotion entered the scene, its speeds initially terrified people, and many wondered whether health problems would be brought on by speeds so unnatural to human life. And yet, as the technology developed, the railway became the best interconnection tool humanity had, a way to bridge the gaps between city life and rural, poor and wealthy, near and far.

Following this train of thought


Early predecessors of railways were put in place to help miners get coal and minerals out of deep tunnels as early as the mid-1500s. At first, wooden rails were used, later to be replaced by iron, and heavy loads were pulled by horsepower. However, we can go back even farther, to ancient Mediterranean civilizations, and find early examples of the concept: tracks carved into stone that allowed wagons to travel predefined routes.

The steam engine, designed in 1712 by a British mining engineer named Thomas Newcomen, was originally a simple device meant to support faster mining. Newcomen’s design received later improvements from the Scottish inventor James Watt, who patented a design for a steam locomotive in 1784. What followed were twenty-odd years of innovation from different engineers, mechanics, and industrial explorers, all aiming to build a functional steam locomotive that could haul loads and become commercially viable.

It would take just over a hundred years to get from the first steam engine to the first public, steam-only railway line, an honor claimed by the Liverpool and Manchester Railway in 1830, and for the next hundred years steam-powered trains would dominate travel across the world (though electrically powered trains also became popular and were in regular use in some locations as early as 1891).

Diesel-electric trains (locomotives that use diesel engines to power electric generators) were popular for many years, supplanting steam-powered engines. They were a staple of national transport in many countries before being phased out throughout much of Europe and elsewhere in favor of electric trains. Sadly, in the United States, where infrastructure has lagged desperately behind continental Europe for decades, the environmentally detrimental diesel engine continues to be used.

Electric trains and hydrogen-powered trains are now the preferred means of transport in most nations where railways are maintained by the public. Since the United States continues to operate on an outdated, privatized model for its rail infrastructure, its train system has lagged, but elsewhere great strides are being made to further improve and extend green railway access as the best alternative to both automobile and air travel.

Washing Machine

How do you wash your clothes? This is something to which, for centuries, the people of the world devoted a rather extreme portion of their time. Especially if you were poor, you might not only wash your own clothes and those of your family, but also the clothes of those higher than you on the social ladder. Disproportionately, too, this was “women’s work,” a job looked down on as menial and beneath notice. And yet, now, I’d be willing to bet that everyone reading this has access to a washing machine. If not personally owned, then certainly made available at a local laundromat or professional washing service.

If you fall into that category, you are among around 2 billion others who share this luxury. Out of the nearly eight billion people on the planet, more than five billion still wash their clothes exclusively by hand, usually with water painstakingly pulled from local sources and heated over fires. According to a 2010 TED talk by Hans Rosling, a data analyst and health expert, this difficult task still consumes many hours each week for people (mostly women) around the world.

So, how did the invention of the washing machine come to be? How did it morph from a prized rarity in even wealthy countries into something taken for granted by so many?

Cleaning up the history


As early as 1691, there were patents in Britain for the washing machine, then a device ultimately powered by hand, but using machine principles to dramatic effect. Jacob Christian Schäffer, a German professor and entomologist, invented and patented an early drum design for the washing machine in 1767. Most of these devices were variations on a rotating drum theme, turned by a crank mechanism. Despite the invention of steam power in the 1700s, it would take another 200 years for steam power to finally reach the world of washing technology.

Through the late 19th century, further developments on the washing machine were made, including the invention of machines that used rollers to remove excess water from the clothing as part of the washing process.

Oliver Woodrow holds the first patent in the US Patent Office for a “driving mechanism” for a washing machine powered by electricity. By 1928 there would be almost a million sales a year (dropping to almost half that during the Great Depression). Still, the Depression also led to the proliferation of laundromats, where washing machines could be rented.

In combination with new technology for spin-drying, washing machines grew in popularity through the 1940s alongside the spread of widespread energy grids. Throughout this period, washing machines that used electric timers also started to be introduced, and the first automatic washing machines, which required little to no intervention to operate, were born.

Airplane

Our story begins over a thousand years ago, as most good stories do.

The earliest known devices for flight existed in China, in the form of bamboo rotary toys for children and kites, which were used to lift men off the ground as early as the 4th century C.E. Likewise, lanterns propelled upward by hot air were in use in China as early as the 3rd century B.C.E.

Somewhere around 810-887 C.E., a man named Abbas ibn Firnas, or Arman Firman, made history. Of Andalusian descent, Abbas was a poet and polymath, talented in many areas of the arts and sciences of his time. By strapping wings to his arms, Abbas was able to jump from a height and glide a certain distance in what may be the first recorded attempt at heavier-than-air flight.

How it all took off


Leonardo da Vinci’s wonderful sketches of wing-like contraptions and rotary flying devices have permeated the public consciousness to some degree. But though the brilliant Italian polymath worked on designs for flying machines and even anticipated many Newtonian principles of physics, his concepts were never put to practical use in the exploration of flight.

It was between the 1670s and early 1700s that efforts to understand and command the power of flight began in earnest. Several early airship designs, accompanied by the mathematical work of Francesco Lana de Terzi, make this period especially important for the rise of the hot air balloons that would take flight in the latter half of the 18th century.

After balloons came airships, dirigible (or steerable) balloons that could carry large loads and make long-distance travel through the air possible.


Sir George Cayley, by 1846, had gained the title of “father of the airplane” for both his study of the physics of flight and his designs of the first heavier-than-air craft. The 1800s saw many sudden leaps forward in technology, especially thanks to the adoption of steam power. In 1848 the first uncrewed steam-powered vehicle in history attained flight.

Through the late 1880s, major advancements in the understanding of aerodynamics took place. With the 1871 invention of the wind tunnel (used to test models of aircraft), a huge step forward was taken toward systematically designing aircraft. Several French inventors made sustained headway, creating gliders and early powered aircraft that could sustain limited lift and flight over short distances.

Using steam engines proved difficult, however, largely due to their mass and complicated design, so heavy emphasis remained on better understanding the physics of flight through gliding. Though several other attempts in the late 1890s and early 1900s succeeded in putting powered, uncrewed vehicles in the air, it was not until the Wright brothers in 1903 that sustained, crewed flight of a powered vehicle became a reality. Even then, independent innovation continued, with the Wright brothers’ inventions not becoming widespread until later.

After that, spurred by two world wars and growing civilian interest, airplanes took off to become the dominant form of travel, as well as the dominant military technology for the next century. And yet, it all started with the guts to jump from high places and a natural human obsession with a load of hot air.

Stone Tools

Of all human inventions, few were more vital to the proliferation of our species across the face of the Earth than stone-tool technology. This was once the height of intelligent, innovative ability. It paved the way for everything that would come later, from metalworking to the Internet.

The earliest inventors got everywhere by knapping


Stone tools are classified into industries that share distinctive characteristics. What follows is a generic overview of these modes, intended to showcase both the timespan of these inventions and their proliferation, as well as the complicated nature of early human invention.

Pre-Mode I tools predate the genus Homo altogether, which is to say that they predate humanity. These date back as far as 3.3 million years and are attributed to Australopithecus, an evolutionary ancestor of modern humans.

Mode I tools, known as the Oldowan industry, were simple stone tools usually created by chipping away at one stone with another to form edges that could be used for cutting or digging. Mode I tools date back to around 2.6 million years ago and became widespread throughout Africa, where early human species such as Homo habilis and Homo erectus inherited the methods involved in this early technology.

Mode III, or the Mousterian industry, was a method of knapping used primarily by Homo neanderthalensis and involved a more concentrated approach to stone technology. After a stone core had been initially knapped, it would be gone over again for additional refinement, allowing sharper tools to be created.

Mode IV, the Aurignacian industry, saw the first widespread use of “blades” (thin pieces of chipped stone that allowed for much greater sharpness and a wider range of cutting applications).


Tying it all together

Mode V is known as the Microlithic Industries and involved the use of composite tools, usually in the form of stone attached to wood or bone shafts and handles. This dramatically increased the utility of the tools available. This advancement in stone technology began around only 35,000 years ago, making it a quite recent invention.

The period of Neolithic industries saw the widespread proliferation and use of ground stone tools around 10,000 BCE. These ground tools must have been extremely labor-intensive to construct and would have been incredibly valuable to the societies that commanded them. By grinding certain stones that resisted flaking against other stones with rough surfaces, sometimes with a lubricant such as water involved, shapes could be formed from the raw material and then put to a wide variety of purposes.

But stone-tool innovation is not just a pre-historic element. As late as the 1600s, specialized stoneworking was required to create flintlock pistols, and even today, there are some instances where obsidian surgical knives are found in use for delicate surgery.
