In which we look at the past, present, and near-term future of the incredible new machines that threaten to overwhelm us - their creators.1
It's easy to point at stories of tech-gone-wild like Black Mirror, The Circle, and The Terminator and say, "Spooky! But that's just fiction." The nonfiction version, though, is more insidious and terrifying. Like a modern version of Shelley's Frankenstein monster, most of the parts of our new Monster are already assembled on the lab table. All we need is the right kind of spark to get it twitching, off the table, and chasing us across town with bad intentions.
These sparks are already flying. Coronavirus and social unrest are adding nitroglycerin.
People with scopophobia (the fear of being watched) are going to have a tough couple of decades. If you assume only public cameras and web browsers are tracking your actions and thoughts, we've got some bad news. As we predicted in our first book, Code Halos (published in 2014), literally every thing is now generating code - data - about you, your parents, your kids, your dog, your car, your home.
Where you go is tracked by the mobile surveillance device in your pocket that we still call a phone (which is kind of quaint and adorable). Toys your children play with are spitting out data to help fine-tune targeted kiddo marketing. If your car has a telematics device, your insurance company can know when you're trying to drive after too much wine. One telco leader we know won't use all of her company's products simply because she knows how hi-res the picture would be if they connected her browser data, viewing habits, phone history, and smart-home data. Your power company knows when you are cooking, when you open your refrigerator for that late-night calorie binge, and how much laundry you do. Clearview AI's software can take a single picture of you and connect it with all the other photos of you in a data set of more than three billion images harvested from social media (most of which you shared). Proponents of contact tracing - in the ascendancy as we write - will suggest that use of this technology go from voluntary to mandatory for the "health of the many." In doing so, surveillance will burst from the shadows and, now more legal and emboldened, strut around the stage on full display.
Virtually every employable skill, the common narrative goes, will be replaced by AI run amok. Our jobs will be automated away, leaving us in a dystopian nightmare scenario where the only human jobs will be oiling and maintaining the Terminators. Driverless cars, doctor-less surgery, chef-less meals, soldier-less armies, and teacher-less schools aren't really around the corner, but many people are nearly paralyzed by the prospect. As we explained in Machines, much of this fear is overblown, but the hard truth is, when it comes to robots and jobs, there will be blood.
Speaking of blood, the stylized sci-fi of Slaughterbots and Terminators might seem troublingly prescient, but the most likely future of conflict is even more frightening. Hybrid war with smaller groups of special forces operators, pocket nukes, IEDs, lasers, drones, space-based weapons, and rail guns linked with AI technology is warping how power will be exerted. Perhaps more importantly, this smaller-scale, easily assembled but highly intelligent and automated weaponry is lowering the barriers to entry to the Big War Show. Nobody needs a particle accelerator or a nuclear-powered aircraft carrier to create a massive swarm of unmanned aerial vehicles that link facial recognition software, consumer-grade drones, and small explosive charges. That horsefly buzzing around? It's actually a miniature drone packed with C4 explosives and AI targeting your face. The combination of the internet + AI + quantum computing is not online yet, but the prospect is straight-up terrifying.
Scared yet? Us too. But how did this happen? It's certain that nobody planned to build a technology monster. To understand how we got here, and how we can regain control, we need to go back to the beginning.
In 1989, in a small, non-descript office at the European Organization for Nuclear Research, perched on the border of Switzerland and France, a youngish self-proclaimed trainspotter-type nerd wrote a document outlining how the idea of "hypertext" (developed by computer scientist Ted Nelson in 1963) could be overlaid on top of the "Transmission Control Protocol" (developed by networking specialists Vint Cerf and Bob Kahn in 1974) to create something he called the "World Wide Web."
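To see how literal that layering is, here is a minimal sketch in Python (our illustration, not Berners-Lee's document; example.com is just an illustrative host): a hypertext request riding on top of a plain TCP connection.

import socket

# The TCP layer: a raw, reliable byte stream to a web server.
with socket.create_connection(("example.com", 80)) as conn:
    # The hypertext layer: an HTTP request written onto that stream.
    conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := conn.recv(4096):
        response += chunk

# The first line of the reply is the whole idea in miniature:
# hypertext carried over TCP.
print(response.decode(errors="replace").splitlines()[0])

Everything above the socket - the GET, the Host header, the reply - is Berners-Lee's contribution; everything below it is Cerf and Kahn's.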
The document, authored by one Tim Berners-Lee, was initially met with head-scratching. His immediate supervisor called it "vague." Within months, though, it began spreading through computer circles like wildfire. It quickly reached academics at the University of Illinois at Urbana-Champaign, entrepreneurs and politicians in Washington, DC, and venture capitalists in Silicon Valley.
The rest, as historians say, is history.
In 2014, in a British Council list of the 80 moments that shaped the world over the previous 80 years, the birth of TBL's baby was ranked #1.2
Fast-forward to today, and what hasn't the internet touched? Anything, anybody? We've got nada.
Of course, the internet, on which the World Wide Web rests, had been developing for decades, but Berners-Lee's insights took what had been a military-academic curio and jammed it right into the middle of everything. And nothing was ever the same again.
The internet/World Wide Web (simply "the internet" in the rest of this manuscript) has become the hub around which every other technology in today's world revolves. Immense global cloud computers, ubiquitous connectivity, supercomputers in our pockets, machines starting to walk and talk, 5G networks rolling out to turbo-charge our connected lives - none of these would have the power they do without the internet at the core. The internet has become the world's central nervous system. And its spine. And its brain.
Obviously, Mr. Berners-Lee can't take all the credit (or the blame). Many, many other brilliant men and women have given birth to parts of the modern tech world and been instrumental in turning science fiction into science fact. Space prohibits too much of that history being revisited here, but there is one person and one idea that deserve special mention in this brief review of how we got where we got - Gordon Moore and his eponymous law.
Moore's Law states that the number of transistors on a computer chip will double about every two years. Simply put, computers continue to get more and more powerful more and more quickly. Never has a law been so aptly named.
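To make the doubling concrete, here is a minimal back-of-the-envelope sketch (our illustration, using Intel's 1971 4004 chip and its roughly 2,300 transistors as a baseline; the book cites no such figures):

# Moore's Law as arithmetic: transistor counts doubling every two years.
def projected_transistors(year, base_year=1971, base_count=2_300):
    """Count implied by a strict two-year doubling from the baseline."""
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1985, 2000, 2020):
    print(year, f"{projected_transistors(year):,.0f}")
# 1971 -> 2,300; 1985 -> ~294,000; 2000 -> ~53 million; 2020 -> ~55 billion

That last figure is the story of the modern tech industry in one number: the doubling compounds from a few thousand transistors to tens of billions within a single working lifetime.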
Proposed in 1965, when most people had very little idea of what a computer really was, let alone interacted with one, Moore's prediction was a remarkably insightful signpost to the future that set off the arms race we continue to run to this day. In 1965, the IT "business" was a cottage industry that employed a few thousand people in white coats. Today, this business is a global behemoth with nary a white coat in sight.
As a rallying cry and a goal, Moore's Law acted to galvanize and organize computer-based research and development around the world. But nobody really knew - including Moore himself - how long the law would remain relevant. If you'd said 55 years, Moore would have taken that bet. If you'd said indefinitely, even Moore would have raised a quizzical eyebrow.
But here we are today, and although intuitively it feels like our ability to build more powerful machines should start to slow down as we reach the limits of physics (the sheer size of atoms and electrons) and economics (rising production costs as integrated circuits become ever more densely packed), new "laws" related to new technologies are coming into effect to keep innovation going (and perhaps accelerating).
One of the most powerful, and potentially chilling, new laws governs the growth of quantum computing.
On YouTube, there's a short video that shows off IBM's quantum computer.3 Watching it, you can almost imagine how it felt hearing Marconi's first Morse code tapping or seeing the first grainy images on a television screen just a few inches wide. After a few seconds of technical detail, we mostly focus on the sound of the gizmo in action. It has the rhythmic, mechanical swoosh-clunk of an MRI machine.
Inside, there is some kind of black art. To exhibit quantum properties, the processor must be chilled to 15 millikelvin, colder than outer space, nearly absolute zero. Frostbit atoms of niobium and silicon become almost completely motionless. And it's dark, shielded so that no light photons or magnetic fields can seep in to mess things up.4
And that's when things get really weird.
Unlike traditional computing, based on binary code composed of zeros and ones, quantum computing introduces a new state - superposition - in which a qubit is both zero and one at once, which (along with other features like entanglement that even Einstein called "spooky")...
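For readers who want the formal shorthand, the standard quantum notation (ours, not the book's) captures "both zero and one" in a single line:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
\]

Here \(\alpha\) and \(\beta\) are complex amplitudes: until the qubit is measured, it occupies a weighted blend of both basis states, and measurement yields 0 with probability \(|\alpha|^2\) and 1 with probability \(|\beta|^2\).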